Work-Life Balance

This is the time of year for New Year’s resolutions, and since most adult Americans work, many of us make promises about work, chiefly to stay calm and not get frazzled.

That’s why a piece, “4 Reasons We’re Frazzled at Work,” caught my eye. As I read the article, I found myself saying, “So that’s why!” Here’s how the writer began:

Your better mind knows exactly how to manage your time better at work but a primal, seemingly uncontrollable urge to do the opposite overtakes you.

You know you should say no when you’re asked to take on that new project, but you say yes. Or you know your boss said your report was good enough, but you work until midnight perfecting it. Or you’re just stuck — wanting to do better but unsure that trying will help — so you do nothing.

If you are frustrated with your seemingly irrational behavior, the root issue may be deep subconscious programming known as your “attachment style.” Your attachment style dictates how you relate to other people, particularly in situations that trigger stress.

The good news is that many workplaces now provide on-site yoga classes for their employees. Here’s how a recent piece put it:

I have always been a type-A person — I like structure, planning and efficiency — and while that has certainly helped me get a lot done, it has also sometimes pushed me to do things too quickly, to be impatient and to miss opportunities to learn through listening.

Yoga has been the counter to that motor — even when I am upside-down in a headstand. The practice of yoga involves breathing, meditation and postures, sometimes physically challenging ones and sometimes poses that are challenging in their simplicity — like just being still. I have been practicing yoga for nearly two decades, after being drawn to the physical comfort the stretches brought me as a teenager, and completed my 200-hour teaching training in 2011. In recent years, I have brought that practice to The New York Times, where I have worked for over four years and am now a director of communications.

This is just a snippet. Want more? You can read the full articles here and here.

Why Gig?

As you hold your smartphone and consider how it has changed your life, you might be inclined to think that the tech industry alone created the gig economy. But you would be wrong.

The gig economy is enabled by technology, but technology didn’t create it. It grew out of the insecure nature of work today, a far cry from the world of baby boomers’ parents, who went to work for one company and retired at 65 with a gold watch.

I read one of the best explanations of this change in a piece entitled “The Gig Economy Isn’t the iPhone’s Fault.” Here’s how it began:

When we learn about the Industrial Revolution in school, we hear a lot about factories, steam engines, maybe the power loom. We are taught that technological innovation drove social change and radically reshaped the world of work.

Likewise, when we talk about today’s economy, we focus on smartphones, artificial intelligence, apps. Here, too, the inexorable march of technology is thought to be responsible for disrupting traditional work, phasing out the employee with a regular wage or salary and phasing in independent contractors, consultants, temps and freelancers — the so-called gig economy.

But this narrative is wrong. The history of labor shows that technology does not usually drive social change. On the contrary, social change is typically driven by decisions we make about how to organize our world. Only later does technology swoop in, accelerating and consolidating those changes.

This insight is crucial for anyone concerned about the insecurity and other shortcomings of the gig economy. For it reminds us that far from being an unavoidable consequence of technological progress, the nature of work always remains a matter of social choice. It is not a result of an algorithm; it is a collection of decisions by corporations and policymakers.

Want more? You can read the full article here.

The Year in Books

With tens of thousands of new books on the market, deciding what to read is getting more and more challenging. Friends recommend books, and suggestions come at us from every direction.

That’s why I gravitate to The New York Times best-seller lists in the Sunday Book Review section, as well as its periodic lists of critics’ top choices.

Here is how the latest list of top books begins:

If we had to use a single word to describe the past year in books, it might be eclectic. Novels were told from the perspective of a woman imprisoned for murder, a woman who suddenly inherits a Great Dane and a woman having an affair with a writer who strongly resembles Philip Roth. We also got an esteemed literary biographer turning her lens on herself, a sprawling, fresh look at New York’s postwar art world and clear-eyed advice about how to die. As in 2017, some of the year’s best nonfiction addressed global tumult — but a bit more subtly, in several cases, by casting an eye back to distant but still-resonant history, like the decades of deferral and denial that led to the Civil War. Below, The New York Times’s three daily book critics — Dwight Garner, Parul Sehgal and Jennifer Szalai — share their thoughts about their favorites among the books they reviewed this year, each list alphabetical by author.

Want more? You can read it here.

Selling You

Remember when there were salesmen – perhaps those people who went door-to-door selling vacuum cleaners – and the rest of us? That line is now blurred, and perhaps completely erased.

I had that inkling as I got more and more requests for blurbs, book reviews, likes, follows, etc., but I couldn’t quite put my finger on what was going on until I read a piece, “We’re All in Sales Now.” Here’s how it began:

There is something about the consumer madness of the holiday season that makes me think of my friend Rebecca’s mother. When I was in middle school, she had a side hustle selling acrylic-rhinestone bug brooches. The jewelry was hard to move on its merits — even for the 1980s it was staggeringly ugly. But what she lacked in salable product, she made up for in sheer selling stamina. Every sleepover, school fair or birthday party, out would come the tray of bejeweled grasshoppers and stag beetles, glinting with Reagan-era menace.

Presumably, someone was making money from this venture — some proto-Trump barking orders from his tax haven — but it certainly didn’t seem to be Rebecca’s mother, whose sales pitches took on an ever more shrill note of desperation.

Soon she had given up even the basic social pretense that we might actually want the brooches. The laws of supply and demand morphed seamlessly into the laws of guilt and obligation, and then into the laws of outright malice, mirroring the trajectory of capitalism itself.

At that time, when naked hawking to your friends was still considered an etiquette blunder, the sales pitches by Rebecca’s mother felt embarrassing — as gaudy and threatening to the social ecosystem as a purple rhinestone daddy longlegs. But 30 years later, at the height of the gig economy, when the foundation of working life has apparently become selling your friends things they don’t want, I look back to that raw need in Rebecca’s mother’s eyes with something terrifyingly approaching recognition.

Want more? You can read the full article here.

Tech Anniversary

San Francisco in 1968 was littered with flower children, free love and dreams of utopia encapsulated in Timothy Leary’s exhortation: “Turn on, tune in, drop out.” How wrong that was! But out of this purple haze rose that year’s Joint Computer Conference, where an assembly of geniuses wearing white short-sleeved shirts and pocket protectors convened 50 years ago this week. The event shined a guiding light on the path to personal computing and set the modern world in motion.

On Dec. 9, 1968, Doug Engelbart of the Stanford Research Institute presented what’s now known as “The Mother of All Demos.” Using a homemade modem, a video feed from Menlo Park, and a quirky hand-operated device, Engelbart gave a 90-minute demonstration of hypertext, videoconferencing, teleconferencing and a networked operating system. Oh, and a graphical user interface, display editing, multiple windows, shared documents, context-sensitive help and a digital library. Mother of all demos is right. That quirky device later became known as the computer mouse. The audience felt as if it had stepped into Oz, watching the world transform from black-and-white to color. But it was no hallucination.

So what have we learned in 50 years? First, augmenting humans is the purpose of technology and ought not be feared. Engelbart described the possibilities in a 1970 paper. “There will emerge a new ‘marketplace,’ representing fantastic wealth in commodities of knowledge, service, information, processing, storage,” he predicted. “In the number and range of transactions, and in the speed and flexibility with which they are negotiated, this new market will have a vitality and dynamism as much greater than today’s as today’s is greater than the village market.” Today Google is Memex 1.0, while Amazon and a whole world of e-commerce have realized the digital market.

World’s Policeman?

Most Americans feel somewhere deep in their gut that it is futile for the United States to try to be the world’s policeman – but many of us have trouble articulating why playing that role is a bad idea.

A review of Stephen Walt’s new book, “The Hell of Good Intentions,” helped me understand just how badly we stumble when we try to be everything to everybody. Here’s an excerpt:

Like Edmund Burke, who warned, “I dread our own power and our own ambition; I dread our being too much dreaded,” Walt views America’s recurrent bouts of missionary zeal with consternation. Others, like the foreign policy writer Robert Kagan, may fret about an encroaching jungle invading the gardens of the West; Walt’s attitude is to forget about trying to trim it back. As a longstanding member of the realist school of foreign policy, which has traditionally subordinated considerations about human rights and morality to a balance of power, Walt might be expected to wax enthusiastic about Donald Trump, who has espoused a “principled realism” and condemned the foreign policy establishment. Walt, however, exhibits as much disdain for Trump’s bellicosity as he does for the liberal internationalists that he indicts here. Walt’s book offers a valuable contribution to the mounting debate about America’s purpose. But his diagnosis of America’s debilities is more persuasive than his prescriptions to remedy them.

According to Walt, the dominant narrative after the conclusion of the Cold War was that history was on America’s side, even, as Francis Fukuyama put it in a famous 1989 essay in The National Interest, that so-called history had ended and all that remained was economic materialism. Globalization would lead to what Karl Marx had called in the Communist Manifesto a “universal interdependence” among nations; warfare would become a thing of the past. America’s mission was to push other states to protect human rights and to help them transition to democracy.

In Walt’s view, “despite minor differences, both liberal and neoconservative proponents of liberal hegemony assumed that the United States could pursue this ambitious global strategy without triggering serious opposition.” But the very steps that America took to enhance its security, Walt suggests, ended up undermining it. He reminds us, for instance, that George F. Kennan warned in 1999 that NATO expansion eastward was a “tragic mistake” that would, sooner or later, ignite Russian nationalism. Under Vladimir Putin’s leadership, Russia became a revanchist power that launched cyber-attacks on the Baltic States, seized Crimea, invaded Ukraine and interfered in the 2016 American presidential election. In Walt’s telling, “the energetic pursuit of liberal hegemony was mostly a failure. … By 2017, in fact, democracy was in retreat in many places and under considerable strain in the United States itself.”

Want more? You can read more here.

President George H.W. Bush

The well-deserved tributes to President George H.W. Bush have dominated the news for the past week. One article tops my list for best capturing what he meant to our country. Here’s how it began:

Historians will measure the presidency of George H.W. Bush in familiar ways — by how well or poorly he managed the major domestic and international challenges of his time, his leadership qualities, the moral and social legacies he left for future generations.

Mr. Bush’s death on Friday is also a moment to recall a less quarrelsome political order, when relations with traditional allies were more cordial than combative, when government attracted people of talent and integrity for whom public service offered a purpose higher than self-enrichment, when the Republican Party, though slowly slipping into the tentacles of zealots like Newt Gingrich, still offered room for people with pragmatic policies and sensible dispositions.

This is just a snippet. Want more? You can read the full article here.

Minecraft Overdrive

Few dispute that games mirror reality and that reality often resembles games. But Minecraft takes this to a new level, and in doing so it is driving innovation. Here’s why:

Since its release seven years ago, Minecraft has become a global sensation, captivating a generation of children. There are over 100 million registered players, and it’s now the third-best-selling video game in history, after Tetris and Wii Sports. In 2014, Microsoft bought Minecraft — and Mojang, the Swedish game studio behind it — for $2.5 billion.

There have been blockbuster games before, of course. But as Jordan’s experience suggests — and as parents peering over their children’s shoulders sense — Minecraft is a different sort of phenomenon.

For one thing, it doesn’t really feel like a game. It’s more like a destination, a technical tool, a cultural scene, or all three put together: a place where kids engineer complex machines, shoot videos of their escapades that they post on YouTube, make art and set up servers, online versions of the game where they can hang out with friends. It’s a world of trial and error and constant discovery, stuffed with byzantine secrets, obscure text commands and hidden recipes. And it runs completely counter to most modern computing trends. Where companies like Apple and Microsoft and Google want our computers to be easy to manipulate — designing point-and-click interfaces under the assumption that it’s best to conceal from the average user how the computer works — Minecraft encourages kids to get under the hood, break things, fix them and turn mooshrooms into random-number generators. It invites them to tinker.

In this way, Minecraft culture is a throwback to the heady early days of the digital age. In the late ’70s and ’80s, the arrival of personal computers like the Commodore 64 gave rise to the first generation of kids fluent in computation. They learned to program in Basic, to write software that they swapped excitedly with their peers. It was a playful renaissance that eerily parallels the embrace of Minecraft by today’s youth. As Ian Bogost, a game designer and professor of media studies at Georgia Tech, puts it, Minecraft may well be this generation’s personal computer.

Want more? You can read the full article here.

A New World Order

Who will determine the future of civilization as we know it? Many in the United States have become accustomed to this country being responsible for those decisions. But that is changing.

Last week I reported on China’s One Belt One Road initiative. Most see it as a program designed to lift hundreds of millions of Chinese citizens out of poverty.

Others look at it as a move by China to become the world’s dominant power. It remains to be seen what the outcome will be.

But others worry that the United States may be deliberately stepping aside and making room – ample room – for China to step in. Here is how Kori Schake puts it:

Decades from now, we may look back at the first weeks of June 2018 as a turning point in world history: the end of the liberal order.

At a summit in Canada, the president of the United States rejected associating the country with “the rules-based international order” that America had built after World War II, and threatened the country’s closest allies with a trade war. He insulted the Canadian prime minister, and then, just a few days later, lavished praise on Kim Jong-un, the world’s most repressive dictator. Without consulting America’s allies in the region, he even reiterated his desire to withdraw American troops from South Korea.

Such reckless disregard for the security concerns of America’s allies, hostility to mutually beneficial trade and willful isolation of the United States is unprecedented. Yet this is the foreign policy of the Trump administration. Quite explicitly, the leader of the free world wants to destroy the alliances, trading relationships and international institutions that have characterized the American-led order for 70 years.

The administration’s alternative vision for the international order is a bare-knuckled assertion of unilateral power that some call America First; more colorfully, a White House official characterized it to The Atlantic as the “We’re America, Bitch” doctrine. This aggressive disregard for the interests of like-minded countries, indifference to democracy and human rights and cultivation of dictators is the new world Mr. Trump is creating. He and his closest advisers would pull down the liberal order, with America at its helm, that remains the best guarantor of world peace humanity has ever known. We are entering a new, terrifying era.

Want more? You can read it here.

Decision Time!

We make decisions every day – dozens, scores, or even hundreds. Our brains are constantly juggling a dizzying array of choices. Somehow we do this with ease.

But it’s the big decisions that often trip us up and leave us befuddled. That’s why Steven Johnson’s book, “Farsighted: How We Make the Decisions That Matter the Most,” is being widely hailed as a breakthrough in helping us cope with the act of deciding (see the review of his book here: https://www.nytimes.com/2018/10/02/books/review/steven-johnson-farsighted.html).

Johnson shared the highlights of his suggestions in a recent piece in the New York Times. Here’s how he began:

In July 1838, Charles Darwin, then 29, sat down to make a decision that would alter the course of his life. The decision he was wrestling with was not related to scientific questions about the origins of species. It was a different kind of decision — existential as well, but of a more personal nature: Should he get married?

Darwin’s method for making this decision would be recognizable to many of us today: He made a list of pros and cons. Under the heading “not marry” he noted the benefits of remaining a bachelor, including “conversation of clever men at clubs”; under “marry” he included “children (if it please God)” and “charms of music and female chitchat.”

Even if some of Darwin’s values seem dated, the journal entry is remarkable for how familiar it otherwise feels. Almost two centuries later, even as everything else in the world has changed, the pros-versus-cons list remains perhaps the only regularly used technique for adjudicating a complex decision. Why hasn’t the science of making hard choices evolved?

In fact, it has, but its insights have been underappreciated. Over the past few decades, a growing multidisciplinary field of research — spanning areas as diverse as cognitive science, management theory and literary studies — has given us a set of tools that we can use to make better choices. When you face a complex decision that requires a long period of deliberation, a decision whose consequences might last for years or even decades, you are no longer limited to Darwin’s simple list.

Want more? You can read the full article here.