Heart or Head?


Tim Parks knows how to get the conversation going and the emotions flowing. Here’s how he titled his piece in the Roving Eye column of the New York Times Book Review: “Should Novels Aim for the Heart or the Head?”

That will keep you reading. Here is part of what he shared:

And he [Montaigne] asks a question that no one asks these days: “Is it right for the arts to serve our natural weakness and to let them profit from our inborn animal-stupidity?” Aside from its astute selection of moving detail, art is constantly in the business of manipulating our emotions, as if this were an end in itself. This, after all, was Plato’s objection to the arts and every kind of artistic effect — that it was manipulative and potentially mendacious. Or simply a waste: “How often,” Montaigne asks, “do we encumber our spirits with yellow bile or sadness by means of such shadows?”

If we apply these ideas to narrative fiction as it is today, what do we find? First, the idea that a book, or film for that matter, stimulates extreme emotions is constantly deployed as a promotional tool. Terrifying, hair-raising, profoundly upsetting, painfully tender, heartbreaking, devastating, shocking, are all standard fare in dust-jacket blurbs and newspaper reviews; it is as if the reader were an ectoplasm in need of powerful injections of adrenaline. Anything that disturbs us, arouses us, unsettles us, is unconditionally positive. “You will be on the edge of your seat.” “Your heart will be thumping.” “Your pulse will be racing.” Aristotle’s response to Plato, that arousing emotion could be positive so long as the emotion was clarified, cathartically contained and understood, is rarely invoked. At best there is the implication that arousing emotions fosters sympathy, perhaps even empathy, with fictional characters and that such sympathy then breaks down our prejudices and hence is socially useful. So readers will frequently be invited to contemplate the sufferings of threatened minorities or discriminated-against ethnic groups, or the predicament of those who are young, helpless and preferably attractive. But this is an alibi and we all know it; what matters is stimulating emotion to sell books.

Want more? You can read the full piece here.

Shop – Or Not?


I suspect our cave-dwelling ancestors had it much easier than we do now. When they got hungry, one (or several) of them went out on the savannah, found a beast worth killing, then brought it back to the cave where it nourished the clan.

Today, that fresh game is found in supermarkets – and even online – so we do what we do: we shop. We shop for everything. And as relative abundance has prevailed in the first world, we shop for things we need – as well as for things we don’t.

That’s why I was so intrigued by Ann Patchett’s recent piece in the New York Times, “My Year of No Shopping.” Here’s how she teed it up:

The idea began in February 2009 over lunch with my friend Elissa, someone I like but rarely see. She walked into the restaurant wearing a fitted black coat with a high collar.

“Wow,” I said admiringly. “Some coat.”

She stroked the sleeve. “Yeah. I bought it at the end of my no-shopping year. I still feel a little bad about it.”

Elissa told me the story: After traveling for much of the previous year, she had decided she had enough stuff, or too much stuff. She made a pledge that for 12 months she wouldn’t buy shoes, clothes, purses or jewelry.

I was impressed by her discipline, but she shrugged it off. “It wasn’t hard.”

I did some small-scale experiments of my own, giving up shopping for Lent for a few years. I was always surprised by how much better it made me feel. But it wasn’t until last New Year’s Day that I decided to follow my friend’s example.

Want more? You can read the full article here.

Paula Hawkins


One of the most popular writers today is Paula Hawkins, author of The Girl on the Train. And many of us are always interested in learning what great writers read. Some excerpts:

What books are currently on your night stand?

“As If,” by Blake Morrison; “The Underground Railroad,” by Colson Whitehead; Virginia Woolf’s “A Writer’s Diary.” I’m also listening to the audiobook of “A Brief History of Seven Killings,” by Marlon James.

What’s the last great book you read?

“A Little Life,” by Hanya Yanagihara. I came to it rather late — I’d been put off by what I’d heard about the upsetting subject matter, but when I heard Hanya speak about the book at the Sydney Writers’ Festival in May I changed my mind. And I’m so glad I did, because while it was every bit as traumatic as everyone said it would be, it is also a remarkable study of friendship, suffering and the difficulty of recovery. Incidentally it is the first audiobook I have ever listened to, and I’m now a total convert. I’d forgotten what a joyous thing it is to allow yourself to be told a story.

Want more? You can read the full piece here.

Artificial Intelligence


Few technologies have had as big an impact as artificial intelligence, or AI – and few promise to have a bigger one in the future.

That’s why it was no surprise that the New York Times Magazine featured an article entitled “Can A.I. Be Taught to Explain Itself?” For me, it was riveting. Some excerpts:

It has become commonplace to hear that machines, armed with machine learning, can outperform humans at decidedly human tasks, from playing Go to playing “Jeopardy!” We assume that is because computers simply have more data-crunching power than our soggy three-pound brains. Kosinski’s results suggested something stranger: that artificial intelligences often excel by developing whole new ways of seeing, or even thinking, that are inscrutable to us. It’s a more profound version of what’s often called the “black box” problem — the inability to discern exactly what machines are doing when they’re teaching themselves novel skills — and it has become a central concern in artificial-intelligence research. In many arenas, A.I. methods have advanced with startling speed; deep neural networks can now detect certain kinds of cancer as accurately as a human. But human doctors still have to make the decisions — and they won’t trust an A.I. unless it can explain itself.

“Artificial intelligence” is a misnomer, an airy and evocative term that can be shaded with whatever notions we might have about what “intelligence” is in the first place. Researchers today prefer the term “machine learning,” which better describes what makes such algorithms powerful.

The idea was to connect leading A.I. researchers with experts in data visualization and human-computer interaction to see what new tools they might invent to find patterns in huge sets of data. There to judge the ideas, and act as hypothetical users, were analysts for the C.I.A., the N.S.A. and sundry other American intelligence agencies.

Even if a machine made perfect decisions, a human would still have to take responsibility for them — and if the machine’s rationale was beyond reckoning, that could never happen.

Intrigued? You can read the full article here.

Time to Party…Or?


By almost any measure, the U.S. economy and the world economy are booming. We seem to have moved well beyond the 2008 recession and are firing on all cylinders.

That’s why I found Desmond Lachman’s New York Times article, “The Global Economy Is Partying Like It’s 2008,” so intriguing. He wonders if we’re in another bubble. He begins like this:

Certainly, the American economy is doing well, and emerging economies are picking up steam. But global asset prices are once again rising rapidly above their underlying value — in other words, they are in a bubble. Considering the virtual silence among economists about the danger they pose, one has to wonder whether in a year or two, when those bubbles eventually burst …

This silence is all the more surprising considering how much more pervasive bubbles are today than they were 10 years ago. While in 2008 bubbles were largely confined to the American housing and credit markets, they are now to be found in almost every corner of the world economy.


Want more? You can read the full piece here.

Best Books


One thing many of us look forward to on Sundays is to read the New York Times Book Review section. Here, outstanding writers offer their opinions on new books. They never seem to miss.

For much the same reason, we look forward to the Times year-end “best books” list. For 2017, the list is especially rich. Here is how the article begins:

This was a year when books — like the rest of us — tried to keep up with the news, and did a pretty good job of it. Novels about global interconnectedness, political violence and migration; deeply reported nonfiction accounts of racial and economic strife in the United States; stories both imagined and real about gender, desire and the role of beauty in the natural world. There were several worthy works of escapism, of course, but the literary world mostly reflected the gravity and tumult of the larger world. Below, The New York Times’s three daily book critics — Dwight Garner, Jennifer Senior and Parul Sehgal — share their thoughts about their favorites among the books they reviewed this year, each list alphabetical by author. Janet Maslin, a former staff critic who remains a frequent contributor to The Times, also lists her favorites.

Want more? You can read the full article here.

Tech Rising

What do technology and architecture have in common? Your first reaction might be, “not much,” but a closer look at what is happening to the San Francisco skyline might change your mind.

David Streitfeld’s recent piece, “San Francisco’s Skyline, Now Inescapably Transformed by Tech,” features the subtitle: “Salesforce Tower, which at 1,070 feet is the tallest office building west of the Mississippi, will be inhabited in January, signaling tech’s triumph in the city.”

This short piece in the Sunday New York Times business section marks not just an association but a marriage between technology and architecture.

Streitfeld notes that in Silicon Valley, the office parks blend into the landscape. They might have made their workers exceedingly rich, they might have changed the world — whether for better or worse is currently up for debate — but there is nothing about them that says: We are a big deal.

Skyscrapers tell a different story. They are the pyramids of our civilization, permanent monuments of our existence. They show who is in charge and what they think about themselves. Salesforce Tower is breaking a San Francisco height record that stood for nearly half a century.

Intrigued? You can read the full article here.

Stories


Ever since our cave-dwelling ancestors drew pictures of their successful hunt (it had to be successful, or they wouldn’t have returned to talk about it), Homo sapiens have been enticed by stories.

We have known that for generations, but today that notion is under attack by the idea of making “informed decisions” supported by DATA.

But now there is push-back, and that’s why I found David Leonhardt’s recent piece, “What I Was Wrong About This Year,” so refreshing. It puts an exclamation point on the need for STORY. Here is how he begins:

The Israeli intelligence service asked the great psychologist Daniel Kahneman for help in the 1970s, and Kahneman came back with a suggestion: Get rid of the classic intelligence report. It allows leaders to justify any conclusion they want, Kahneman said. In its place, he suggested giving the leaders estimated probabilities of events.

The intelligence service did so, and an early report concluded that one scenario would increase the chance of full-scale war with Syria by 10 percent. Seeing the number, a top official was relieved. “Ten percent increase?” he said. “That is a small difference.”

Kahneman was horrified (as Michael Lewis recounts in his book “The Undoing Project”). A 10 percent increase in the chance of catastrophic war was serious. Yet the official decided that 10 wasn’t so different from zero.

Looking back years later, Kahneman said: “No one ever made a decision because of a number. They need a story.”

Want more? You can read the full article here.

Law of Innovation


Most businesses claim to be “all about innovation.” We have made innovation a buzzword, but few have done a deep dive into what innovation really means, especially in business.

While such a broad term defies simple explanation – and can mean many things to many people – I found Christopher Mims’s “Laws of Innovation” piece in the Wall Street Journal helpful in bounding the challenge. Here are his “laws.”

Three decades ago, a historian wrote six laws to explain society’s unease with the power and pervasiveness of technology. Though based on historical examples taken from the Cold War, the laws read as a cheat sheet for explaining our era of Facebook, Google, the iPhone and FOMO.

You’ve probably never heard of these principles or their author, Melvin Kranzberg, a professor of the history of technology at Georgia Institute of Technology who died in 1995.

What’s a bigger shame is that most of the innovators today, who are building the services and tools that have upended society, don’t know them, either.

Fortunately, the laws have been passed down by a small group of technologists who say they have profoundly impacted their thinking. The text should serve as a foundation—something like a Hippocratic oath—for all people who build things.

  1. ‘Technology is neither good nor bad; nor is it neutral.’
  2. ‘Invention is the mother of necessity.’
  3. ‘Technology comes in packages, big and small.’
  4. ‘Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.’
  5. ‘All history is relevant, but the history of technology is the most relevant.’
  6. ‘Technology is a very human activity.’

Want more? You can read the full piece here.

Lost Wars


The terrorist attacks of September 11, 2001, profoundly changed America’s national security equation – perhaps forever.

Those attacks spawned the wars in Afghanistan and Iraq, which have all but consumed the U.S. military for more than a decade and a half.

There have been several books – some good and some less so – that have tried to help us come to grips not only with why we embarked upon these wars but also with why we can’t “win” them.

Andrew Bacevich’s review of Daniel Bolger’s book, “Why We Lost,” offers some key insights. Here is how he begins:

The author of this book has a lot to answer for. “I am a United States Army general,” Daniel Bolger writes, “and I lost the Global War on Terrorism.” The fault is not his alone, of course. Bolger’s peers offered plenty of help. As he sees it, in both Afghanistan and Iraq, abysmal generalship pretty much doomed American efforts.

The judgment that those wars qualify as lost — loss defined as failing to achieve stated objectives — is surely correct. On that score, Bolger’s honesty is refreshing, even if his explanation for that failure falls short. In measured doses, self-flagellation cleanses and clarifies. But heaping all the blame on America’s generals lets too many others off the hook.

Why exactly did American military leaders get so much so wrong? Bolger floats several answers to that question but settles on this one: With American forces designed for short, decisive campaigns, the challenges posed by protracted irregular warfare caught senior officers completely by surprise.

Since there aren’t enough soldiers — having “outsourced defense to the willing,” the American people stay on the sidelines — the generals asked for more time and more money. This meant sending the same troops back again and again, perhaps a bit better equipped than the last time. With stubbornness supplanting purpose, the military persisted, “in the vain hope that something might somehow improve.”

Want more? You can read the full article here.