Mindfulness Meditation

Buddha, meet Charles Darwin. This may seem an odd pairing at first, but not according to a recent article by Robert Wright, “The Meditation Cure.” Here is how he begins:

Much of Buddhism can be boiled down to a bad-news/good-news story. The bad news is that life is full of suffering and we humans are full of illusions. The good news is that these two problems are actually one problem: If we could get rid of our illusions—if we could see the world clearly—our suffering would end.

And there’s more good news: Buddhism offers tools for doing that job. A good example is the type of meditation known as mindfulness meditation, now practiced by millions of people in the U.S. and other places far from Buddhism’s Asian homeland. Mindfulness meditation, Buddhists say, can change our perspective on feelings such as anxiety and rage and thereby sap their power to warp our vision and make us suffer.

These claims—the bad news and the good—are more than two millennia old, but they’re now getting important support from evolutionary psychology, the modern study of how natural selection engineered the human mind. Evolutionary psychology gives Buddhism’s diagnosis of the human predicament a back story. It explains why humans are prone to illusions and to suffering and why the two problems are related. And this explanation can strengthen the Buddhist prescription, adding to the power of mindfulness meditation in particular.

Mindfulness meditation is an exercise in attention. It involves calming the mind—typically by focusing on the breath—and then using the resulting equanimity to observe things with unusual care and clarity. The things observed can include sounds, physical sensations or anything else in the field of awareness. But perhaps most important is the careful observation of feelings, because feelings play such a powerful role in guiding our perceptions, thoughts and behavior.

Want more? You can read the full article here.

A New Approach to Eating

If you need more evidence of how much Silicon Valley has come to dominate our lives, you need only look at the statistics: In 2016, the five top U.S. companies by market capitalization were all tech companies. That trend continues today.

Now this trend is moving into our kitchens. Amazon has bought Whole Foods. Some didn’t see this coming, but in a prescient article from some years ago entitled “Rethinking Eating,” here is what the writer suggested:

Having radically changed the way we communicate, do research, buy books, listen to music, hire a car and get a date, Silicon Valley now aims to transform the way we eat. Just as text messages have replaced more lengthy discourse and digital vetting has diminished the slow and awkward evolution of intimacy, tech entrepreneurs hope to get us hooked on more efficient, algorithmically derived food.

Call it Food 2.0.

Following Steve Jobs’s credo that “people don’t know what they want until you show it to them,” a handful of high-tech start-ups are out to revolutionize the food system by engineering “meat” and “eggs” from pulverized plant compounds or cultured snippets of animal tissue. One company imagines doing away with grocery shopping, cooking and even chewing, with a liquid meal made from algae byproducts.

You can read the entire article here.

America and Asia

Over five years ago, in a speech delivered to the Australian Parliament on November 17, 2011, then-U.S. President Barack Obama made the first official announcement of a change in U.S. security policy. He said:

Our new focus on this region reflects a fundamental truth – the United States has been, and always will be, a Pacific nation … Here, we see the future. With most of the world’s nuclear powers and some half of humanity, Asia will largely define whether the century ahead will be marked by conflict or cooperation, needless suffering or human progress.

Since then, as I’ve reported on this website several times (see, for example)…

…and have written about in various national and international publications (here):

Few would argue with the point President Obama made in that speech, that “the United States has been, and always will be, a Pacific nation.” But it wasn’t until I read a book review by Gordon Chang, “Bibles and Ginseng,” in the New York Times that I understood not just why this is true, but also how it happened.

Read this short – and clarifying – review here.

Hurricane Dancing

Some years are watershed years, times when just extrapolating the present into the future isn’t enough.

For most of us, these years come and go, and only in hindsight do we put it all together and take stock of what just happened.

It wasn’t until I read Tom Friedman’s article, “Dancing in a Hurricane,” that I realized 2007 was such a year. Here is part of what he said:

What the hell happened in and around 2007? 2007? That’s such an innocuous year. But look again.

Steve Jobs and Apple released the first iPhone in 2007, starting the smartphone revolution that is now putting an internet-connected computer in the palm of everyone on the planet. In late 2006, Facebook, which had been confined to universities and high schools, opened itself to anyone with an email address and exploded globally. Twitter was created in 2006, but took off in 2007. In 2007, Hadoop, the most important software you’ve never heard of, began expanding the ability of any company to store and analyze enormous amounts of unstructured data. This helped enable both Big Data and cloud computing. Indeed, “the cloud” really took off in 2007.

In 2007, the Kindle kicked off the e-book revolution and Google introduced Android. In 2007, IBM started Watson — the world’s first cognitive computer that today can understand virtually every paper ever written on cancer and suggest to doctors highly accurate diagnoses and treatment options. And have you ever looked at a graph of the cost of sequencing a human genome? It goes from $100 million in the early 2000s and begins to fall dramatically starting around … 2007.

Read more of this short – but revealing – article here.

Artificial Intelligence

You don’t have to pick up a technical journal to be exposed to articles about artificial intelligence, machine learning, autonomy, deep learning, and the like. This technology surrounds us today and is quickly becoming something we access on a daily basis – witness Siri, Alexa, and other apps and devices we not only use for convenience but count on every day.

Because of the seemingly sudden ubiquity of artificial intelligence (commonly called AI), there is vastly more heat than light on this subject.

That’s why I found this MIT Technology Review article, “The Dark Secret at the Heart of AI,” so fascinating. It asks the crucial question: Do we really know what AI is doing for us? Here are a few excerpts:

A car’s underlying AI technology, known as deep learning, has proved very powerful at solving problems in recent years, and it has been widely deployed for tasks like image captioning, voice recognition, and language translation. There is now hope that the same techniques will be able to diagnose deadly diseases, make million-dollar trading decisions, and do countless other things to transform whole industries.

But this won’t happen—or shouldn’t happen—unless we find ways of making techniques like deep learning more understandable to their creators and accountable to their users. Otherwise it will be hard to predict when failures might occur—and it’s inevitable they will.

This raises mind-boggling questions. As the technology advances, we might soon cross some threshold beyond which using AI requires a leap of faith. Sure, we humans can’t always truly explain our thought processes either—but we find ways to intuitively trust and gauge people. Will that also be possible with machines that think and make decisions differently from the way a human would? We’ve never before built machines that operate in ways their creators don’t understand. How well can we expect to communicate—and get along with—intelligent machines that could be unpredictable and inscrutable?

Want more on this fascinating subject? Follow the link to the article.

And for a comprehensive report on the growing value of AI in business, read this PricewaterhouseCoopers report: “Sizing the prize: What’s the real value of AI for your business and how can you capitalize?”

Existential Threat

Few national security issues have dominated the news this year more than the threat of ballistic missiles armed with weapons of mass destruction.

Opposition to missile defense has persisted since the 1980s, but the politics may be changing with technological progress and the rising threat from North Korean dictator Kim Jong Un’s nuclear weapons. Congress has an opportunity this summer to notch a rare bipartisan deal that enhances U.S. security.

Kim has already overseen more nuclear and missile tests than his father and grandfather combined, and the Defense Intelligence Agency warns that “if left on its current trajectory” Pyongyang will develop a capacity to hit Japan, Alaska, Hawaii or even the U.S. West Coast. The Trump Administration is pleading with China to stop the North, but Chinese leaders never seem to act and they’re even trying to block regional missile defenses in South Korea.

Opponents say missile defenses are too expensive given that interception might fail, so better to trust arms control and the deterrence of mutual-assured destruction. But arms talks with North Korea have been a fool’s errand since negotiator Robert Gallucci and Bill Clinton bought its promises in 1994.

Even a 50% chance of interception might increase deterrence by making the success of an enemy first strike more doubtful. North Koreans or other rogues also may not be rational actors who fear their own annihilation. U.S. leaders have a moral obligation to do more than let Kim Jong Un hold American cities hostage, and without defenses a pre-emptive military strike might be the only alternative.

To read more on this subject, see this link from the U.S. Naval Institute.

Brain at Work

Many of us “of a certain age” recall school as a journey of memorization. Whether it was the periodic table of elements, mathematical formulas, or the dates of historic events, to paraphrase the Nike ad, we “just did it.”

That has changed dramatically, and from their earliest days of schooling, today’s kids figure they can just “Google it” to uncover literally anything they want to know. Can we now just give our brains a rest? Not so fast.

I was intrigued by an article in the New York Times entitled “You Still Need Your Brain.” Here is part of what it said:

Google is good at finding information, but the brain beats it in two essential ways. Champions of Google underestimate how much the meaning of words and sentences changes with context. Consider vocabulary. Every teacher knows that a sixth grader, armed with a thesaurus, will often submit a paper studded with words used in not-quite-correct ways, like the student who looked up “meticulous,” saw it meant “very careful,” and wrote “I was meticulous when I fell off the cliff.”

With the right knowledge in memory, your brain deftly puts words in context. Consider “Trisha spilled her coffee.” When followed by the sentence “Dan jumped up to get a rag,” the brain instantly highlights one aspect of the meaning of “spill” — spills make a mess. Had the second sentence been “Dan jumped up to get her more,” you would have thought instead of the fact that “spill” means Trisha had less of something. Still another aspect of meaning would come to mind had you read, “Dan jumped up, howling in pain.”

Read the full article here.

West to East

Few topics are more timely than the relationship between the United States and China, a relationship that is likely to dominate geopolitics throughout the 21st century.

Perhaps because it is so “topical,” a sea of commentators holds forth with theories about this relationship, and there is often more heat than light on the subject.

That is why I found Thomas Christensen’s review of “Easternization” so fascinating: it put the power balance not just between the United States and China, but between West and East, into refreshing perspective. Here is part of what he had to say:

“Easternization” navigates the recent migration of economic, military and political power from the Western Hemisphere to the Eastern. Rachman repeatedly returns to that migration’s main engine — the rise of China — but his thesis is broader. He considers the rise of Asia as a whole, including the growing clout of India and the continuing importance of Japan, a nation that is not currently rising but remains wealthy, technologically sophisticated and economically linked to all continents. Rachman also explores the decreasing ability of Americans and Europeans to shape to their liking outcomes around the world. Relentlessly fair, he resists blaming Asia’s successes for Western problems and recognizes the West’s self-inflicted wounds.

Rachman’s wisdom about global history precludes cartoonish characterizations of “East” versus “West.” Western nations spent more blood and treasure fighting one another, especially in two massive world wars, than they did colluding to dominate others. Similarly, mistrust among Asian states today outstrips mistrust among them and the United States or Europe. Nor do political ideas provide a clear border between East and West. Europe has had more than its share of authoritarian regimes, so it is a stretch to consider the recent rise of illiberal nationalism in Europe as somehow a move “eastward.” Mainland Chinese propagandists rail against the “Western values” of multiparty democracy, a free press and independent courts, but some of the nation’s largest and most successful Asian neighbors — South Korea, Japan, Taiwan, Indonesia and India — are no less Eastern for enjoying all of those institutions. No one, including Rachman, really knows how to categorize Russia. Moscow has tried with limited success to cobble together a Eurasian union with former Soviet republics in Central Asia, but it sees itself as the European part of such a union. Russia’s recent diplomatic lean toward China has more to do with energy markets and the two authoritarian regimes’ shared aversion to American support for color revolutions and regime change than it does with either realpolitik alliance formation or Sino-Russian cultural affinity.

You can read the entire article here.

Moment to Moment

Mindfulness and mindfulness meditation have been around for a while now, with more and more practitioners finding value in living in the moment rather than dwelling on the past or worrying about the future. As one convert put it, “I don’t want to get to the end of my life and find out I didn’t show up for it.” The mindfulness movement is growing, both in our personal lives and in the workplace.

Some wonder why mindfulness hasn’t caught fire more rapidly. I wondered too, before I read an interesting piece in the New York Times entitled “We Aren’t Built to Live in the Moment.” The writers lay out a good case for why we don’t embrace mindfulness more enthusiastically, and it goes directly to what makes us human. Here’s part of what they say:

What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future. Our singular foresight created civilization and sustains society. It usually lifts our spirits, but it’s also the source of most depression and anxiety, whether we’re evaluating our own lives or worrying about the nation. Other animals have springtime rituals for educating the young, but only we subject them to “commencement” speeches grandly informing them that today is the first day of the rest of their lives.

A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise. Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present.

You can read this interesting article here.

Innovation on Steroids

Everyone agrees innovation is good…saying it’s not is like arguing against motherhood and apple pie. But what is innovation? How do you know if you “have it”? If you want more, how do you get it?

I had always wondered about those questions, until John Michaelson’s article in the Wall Street Journal helped answer them. His thesis – that the financial crisis opened the economy to new forms of growth that are about to start pouring down – is compelling.

To no one’s surprise, much of it has to do with technology. Here is part of what he says:

Each decade for the past 60 years, we have seen a thousand-fold increase in world-wide processing power, bandwidth and storage. At the same time, costs have fallen by a factor of 10,000. Advances in these platforms, in themselves, do not produce innovation. But they facilitate the development and deployment of entirely new applications that take advantage of these advances. Amazing new applications are almost never predictable. They come from human creativity. That is one reason they almost never come from incumbent companies. But once barriers to innovation are lowered, new applications follow.

With that as a teaser, you can read the full article here.