Sentences First


Most of us who write are always looking for writing tips. I found some good ones in a recent book review with an intriguing title: “Nailing the Jelly of Reality to the Wall.”

The book under review is “FIRST YOU WRITE A SENTENCE: The Elements of Reading, Writing … and Life,” by Joe Moran. Here’s how the reviewer begins:

A well-formed sentence, Joe Moran writes in his humane and witty guide to meaning-making, “is a cure, however fleeting, for human loneliness.” We all write more sentences now than ever, but how hard do we think about the shape of these etheric objects? A good sentence is a considerate gift; or maybe it’s an easeful, mapless walk with your reader, through a new city — but it might also be a high-wire act (audience agog for disaster). Moran’s book contains many such metaphors for the sentence, and at least one for figurative language itself: “Metaphor is how we nail the jelly of reality to the wall.” Is the sentence a transaction, or is it an artifact? Polished performance or open invitation? “First You Write a Sentence” is a “muted love letter” to the form, arguing in its genially opinionated way for sentences that make our lives more democratic and more pleasurable.

At the calm heart of Moran’s rhetorically affable book is an idea of adroit aplomb. He thinks a sentence should slide down the gullet like a clam, hardly touching the sides. His own prose is much like this. Unlike many writers on style, he doesn’t get carried away with examples; those he provides tend to be by masters of the almost invisible art of elegantly simple diversion. The mind and ear enjoy, but don’t get snagged on, the language of William Tyndale’s English Bible, Thomas Merton’s essays, the recipes of Elizabeth David. The sentences Moran likes derive from the loose, Senecan style perfected in the 17th century by the likes of John Donne, rather than ones from the stiff, hierarchical period of Samuel Johnson a century later. The best modern sentences resemble Donne’s, with simple statements upfront, then a pileup, if need be, of clause upon appositive clause, clarifying, elaborating, potentially without cease — but casually, too, always ready to end.

Want more? You can read the rest of the piece here.

Cyber-War


One of the most cutting-edge military technologies is generally called “cyber.” Most people struggle with this concept and with what “cyber-warfare” actually means.

That’s why I was intrigued by a recent review of David Sanger’s book, “THE PERFECT WEAPON: War, Sabotage, and Fear in the Cyber Age.” Here’s how the reviewer begins:

New technologies of destruction have appeared throughout history, from the trireme and gunpowder in past centuries to biological and nuclear weapons in more modern times. Each technology goes through a cycle of development and weaponization, followed only later by the formulation of doctrine and occasionally by efforts to control the weapon’s use. The newest technological means of mayhem are cyber, meaning anything involving the electronic transmission of ones and zeros. The development of cyber capabilities has been rapid and is continuing; doctrine is largely yet to be written; and ideas about control are only beginning to emerge.

David E. Sanger’s “The Perfect Weapon” is an encyclopedic account of policy-relevant happenings in the cyberworld. Sanger, a national security correspondent for The New York Times, stays firmly grounded in real events, including communication systems getting hacked and servers being disabled. He avoids the tendency, all too common in futuristic discussions of cyber issues, to spin out elaborate and scary hypothetical scenarios. The book flows from reporting for The Times by Sanger and his colleagues, who have had access, and volunteer informants, that lesser publications rarely enjoy. The text frequently shifts to the first-person singular, along with excerpts from interviews Sanger has had with officials up to and including the president of the United States.

The principal focus of the book is cyberwarfare — the use of techniques to sabotage the electronic or physical assets of an adversary — but its scope extends as well to other controversies that flow from advances in information technology. Sanger touches on privacy issues related to the collection of signals intelligence — a business that has been around since before Franklin Roosevelt’s secretary of war, Henry Stimson, talked about gentlemen not reading each other’s mail. He also addresses social media and the problems of misuse that have bedeviled Facebook, including usage by foreign governments for political purposes. These other topics are to some extent a digression from the main topic of cyberwarfare. Intelligence collection and electronic sabotage are different phenomena, which in the United States involve very different legal principles and policy procedures. But Sanger takes note of such differences, and the book’s inclusiveness makes it useful as a one-stop reference for citizens who want to think intelligently about all issues of public policy having a cyber dimension.

You can read the full review here.

Do It Now?


We all want to get things done, right? The easy answer is, “yes, of course.” We all do what needs doing right away, right? Hmmmm…maybe not such a good answer.

We all procrastinate. I know I do, but until I read an article, “Why You Procrastinate and How to Break the Habit,” I didn’t know WHY I did. Now I do. Here’s how the article begins:

If you’ve ever put off an important task by, say, alphabetizing your spice drawer, you know it wouldn’t be fair to describe yourself as lazy.

After all, alphabetizing requires focus and effort — and hey, maybe you even went the extra mile to wipe down each bottle before putting it back. And it’s not like you’re hanging out with friends or watching Netflix. You’re cleaning — something your parents would be proud of! This isn’t laziness or bad time management. This is procrastination.

If procrastination isn’t about laziness, then what is it about?

Etymologically, “procrastination” is derived from the Latin verb procrastinare — to put off until tomorrow. But it’s more than just voluntarily delaying. Procrastination is also derived from the ancient Greek word akrasia — doing something against our better judgment.

“It’s self-harm,” said Dr. Piers Steel, a professor of motivational psychology at the University of Calgary and the author of “The Procrastination Equation: How to Stop Putting Things Off and Start Getting Stuff Done.”

That self-awareness is a key part of why procrastinating makes us feel so rotten. When we procrastinate, we’re not only aware that we’re avoiding the task in question, but also that doing so is probably a bad idea. And yet, we do it anyway.

“This is why we say that procrastination is essentially irrational,” said Dr. Fuschia Sirois, professor of psychology at the University of Sheffield. “It doesn’t make sense to do something you know is going to have negative consequences.”

She added: “People engage in this irrational cycle of chronic procrastination because of an inability to manage negative moods around a task.”

Want more? You can read the full article here.

Trusting AI


Few technologies inspire more controversy than artificial intelligence. Some hail it as a savior; others predict it will spell our doom.

That’s why I was drawn to a recent op-ed, “Build AI we can trust.” Here’s how the two writers begin:

Artificial intelligence has a trust problem. We are relying on A.I. more and more, but it hasn’t yet earned our confidence.

Tesla cars driving in Autopilot mode, for example, have a troubling history of crashing into stopped vehicles. Amazon’s facial recognition system works great much of the time, but when asked to compare the faces of all 535 members of Congress with 25,000 public arrest photos, it found 28 matches, when in reality there were none. A computer program designed to vet job applicants for Amazon was discovered to systematically discriminate against women. Every month new weaknesses in A.I. are uncovered.

The problem is not that today’s A.I. needs to get better at what it does. The problem is that today’s A.I. needs to try to do something completely different.

In particular, we need to stop building computer systems that merely get better and better at detecting statistical patterns in data sets — often using an approach known as deep learning — and start building computer systems that from the moment of their assembly innately grasp three basic concepts: time, space and causality.

Today’s A.I. systems know surprisingly little about any of these concepts. Take the idea of time. We recently searched on Google for “Did George Washington own a computer?” — a query whose answer requires relating two basic facts (when Washington lived, when the computer was invented) in a single temporal framework. None of Google’s first 10 search results gave the correct answer. The results didn’t even really address the question. The highest-ranked link was to a news story in The Guardian about a computerized portrait of Martha Washington as she might have looked as a young woman.

Check out this link to read more.

Leading Technology


Much ink has been spilled regarding the challenges the United States faces in our military technology race with potential adversaries like China and Russia.

One of the best analysts on this issue is Mackenzie Eaglen. Here is part of what she said in a recent op-ed:

In the global arms race, a moment’s hesitation is enough to lose your lead. The Pentagon pioneered research 15 years ago into hypersonic missiles that can cruise at Mach 5. The U.S. then chose not to develop the technology—but China and Russia developed it. Now Beijing and Moscow have hypersonics at the ready and, according to Pentagon research chief Michael D. Griffin, no number of current U.S. ships or ground-based antimissile systems would be enough to counter a massive attack.

The problem stems in part from the Pentagon’s increasing dependence on outside firms. For decades after World War II, the Defense Department was a producer of cutting-edge research and technology, but today it contracts more and more out to Silicon Valley. No longer setting its own course for development, the Pentagon is unable to take the major leaps that once kept U.S. military technology racing ahead.

The Pentagon still acquires its systems in accordance with decades-old protocols that value compliance over nimbleness and usefulness. It has doubled down on unreasonable demands to own intellectual property in perpetuity, a nonstarter for many software companies with which it contracts. Now defense leaders are stuck having to sort out which software systems might pose a security risk because the developers often also sell to America’s rivals.

This shift from calling the shots to negotiating with ever-more-private interests is new for the defense bureaucracy. For generations, influence flowed in the other direction. The buildup in defense research-and-development spending that began in the late 1940s and continued through the ’80s was responsible for propelling many of the tech breakthroughs of the past century: cellphones, jet engines, integrated circuits, weather satellites and the Global Positioning System. A recent example is Apple’s Siri artificial-intelligence system, which it purchased from the Defense Advanced Research Projects Agency.

You can read the full article here.

Happiness


Who doesn’t want to be happy? Even those of us who count ourselves as generally happy seem to always be looking for more.

That’s why I was drawn to a piece by Richard Friedman, “A Swimmer’s Guide to Happiness.” Here is part of what he shares:

Research shows that thinking too much about how to be happy actually backfires and undermines well-being. This is in part because all that thinking consumes a fair amount of time, and is not itself enjoyable.

The researchers behind this study, called “Vanishing Time in the Pursuit of Happiness,” randomly assigned subjects to one of two tasks: One group was asked to write down 10 things that could make them become happier, while the other wrote 10 things that demonstrated that they were already happy.

The subjects were then asked to what extent they felt time was slipping away and how happy they felt at that moment. Those prompted to think about how they could become happier felt more pressed for time and significantly less happy.

This jibes with the argument the journalist Ruth Whippman makes in her 2016 book “America the Anxious: How Our Pursuit of Happiness Is Creating a Nation of Nervous Wrecks.” Trying too hard to be happy — downloading mindfulness apps, taking yoga classes, reading self-help books — mostly just stresses us out, she writes. So what should we do instead? Maybe simply hang out with some friends, doing something we like to do together: “Study after study shows that good social relationships are the strongest, most consistent predictor there is of a happy life.”

Want more? You can read the rest of the piece here.

Who Brings Us AI?


When someone mentions artificial intelligence – AI – we typically think of some Silicon Valley tech titan dressed in jeans and an ever-so-chic sport coat.

But as they say, that’s just the tip of the iceberg. Few of us understand how the enormous troves of data needed to power AI get assembled and crunched.

Cade Metz helps us understand the unseen underbelly of the tech industry. It’s a revealing – and troubling – look at the cost of doing business to get that next cool app. Here’s how he begins:

BHUBANESWAR, India — Namita Pradhan sat at a desk in downtown Bhubaneswar, India, about 40 miles from the Bay of Bengal, staring at a video recorded in a hospital on the other side of the world.

The video showed the inside of someone’s colon. Ms. Pradhan was looking for polyps, small growths in the large intestine that could lead to cancer. When she found one — they look a bit like a slimy, angry pimple — she marked it with her computer mouse and keyboard, drawing a digital circle around the tiny bulge.

She was not trained as a doctor, but she was helping to teach an artificial intelligence system that could eventually do the work of a doctor.

Ms. Pradhan was one of dozens of young Indian women and men lined up at desks on the fourth floor of a small office building. They were trained to annotate all kinds of digital images, pinpointing everything from stop signs and pedestrians in street scenes to factories and oil tankers in satellite photos.

A.I., most people in the tech industry would tell you, is the future of their industry, and it is improving fast thanks to something called machine learning. But tech executives rarely discuss the labor-intensive process that goes into its creation. A.I. is learning from humans. Lots and lots of humans.

Want more? You can read the full article here.

A Novelist for the Ages


This summer we lost one of the icons of American literature, Toni Morrison. Lovely obituaries have been carried in national – and international – media.

But it was Ross Douthat’s recent op-ed that caught my eye. He explained the enormous impact she had on all of us. Here’s how he began:

Toni Morrison was a great American novelist who was also a Great American Novelist. This means she had a special form of celebrity, an oracular status, and also that she was embraced by the tradition that regards novels as keys to interpreting America — insisting that you must read Morrison (and Ellison and Wright and Hurston) to understand the black experience, just as you must read Hawthorne and Melville to understand the legacy of Puritanism, or Faulkner or Cather to understand the South or West, and so on down the high-school English list.

So her passing raises the question: Is she the last of the species? The last American novelist who made novels seem essential to an educated person’s understanding of her country?

That question won’t be answerable for decades — the time it took to exhume, for instance, “Moby-Dick” and “The Great Gatsby” from their temporary graves. We can’t know how Morrison’s reputation will change, or the reputations of her peers or the status of their art form. The American novel was supposed to be eclipsed long ago by movies and television … and yet it proved resilient enough that, coming of age long after TV, I was still imprinted with the idea that novels were essential cultural ground, as important as Spielberg or “The Sopranos.”

But something has changed in the cultural status of the novel in the time I’ve been a reader, the years between Morrison’s canonization and her passing — and maybe especially the years since social media and the iPhone first arrived.

Check out this link to read more.

Harnessing Technology to Make War Safer


Few subjects inspire more furious debate than the terms “war” and “drones.” There is vastly more heat than light on this subject.

That is why I was impressed by the reasoned arguments of a U.S. Marine who wrote an article titled “How Tech Can Make War Safer.”

Amidst all the shrill debate on the subject, Lucas Kunce explained how, when the tech industry refuses to work on defense-related projects, war becomes less safe. Here’s how he begins:

Last year, more than 4,600 Google employees signed a petition urging the company to commit to refusing to build weapons technology. A response to Google’s work with the military on an artificial intelligence-based targeting system, the petition made a powerful and seemingly simple moral statement: “We believe that Google should not be in the business of war.” Similarly, Microsoft employees in February demanded that their company withhold its augmented reality HoloLens headset technology from the Army, saying they did not want to become “war profiteers.”

As a Marine who has been in harm’s way a few times, I am glad that my peers in the tech industry have initiated this discussion. America is long overdue for a conversation about how we engage in war and peace; the difference between the decision to go to war and decisions about what happens on the battlefield during warfare; and what it means to fight, die and kill for our country.

My job has put me in places where I have witnessed and taken part in significant battlefield decisions. From my experience, I have learned that working with the military to develop systems would actually support the tech workers’ goal to reduce harm in warfare. (I need to note here that I am speaking for myself, and my views do not necessarily reflect those of the Department of Defense.)

Tech workers might not realize that their opposition to the work their companies do on military technology does not change the decision-making of the American leaders who choose to go to war, and therefore is unlikely to prevent any harm caused by war. Instead, it has the unintended effect of imperiling not only the lives of service members, but also the lives of innocent civilians whom I believe these workers want to protect.

You can read the full article here.

Charisma


How many times have you heard someone say, “He (or she) has charisma”? Certain people seem to have it, while most of us think we don’t.

Not to put too fine a point on it, but all of us need at least a little bit of charisma. It’s how we influence people and get along in the world.

That’s why I found this article, “Becoming Charismatic, One Step at a Time,” so fascinating. Here’s how it begins:

Ask people to name someone they find charming and the answers are often predictable. There’s James Bond, the fictional spy with a penchant for shaken martinis. Maybe they’ll mention Oprah Winfrey, Bill Clinton or a historical figure, like the Rev. Dr. Martin Luther King Jr. or Mahatma Gandhi. Now ask the same people to describe, in just a few seconds, what makes these charmers so likable.

It’s here, in defining what exactly charisma is, that most hit a wall. Instinctually, we know that we’re drawn to certain people more than others. Quantifying why we like them is an entirely different exercise.

The ancient Greeks described charisma as a “gift of grace,” an apt descriptor if you believe likability is a God-given trait that comes naturally to some but not others. The truth is that charisma is a learned behavior, a skill to be developed in much the same way that we learned to walk or practice vocabulary when studying a new language. Other desirable traits, like wealth or appearance, are undoubtedly linked to likability, but being born without either doesn’t preclude you from being charismatic.

For all the work put into quantifying charisma — and it’s been studied by experts through the ages, including Plato and those we talked to for this piece — there are still a lot of unknowns. There are, however, two undisputed truths.

The first is that we are almost supernaturally drawn to some people, particularly those we like. Though this is not always the case; we can just as easily be drawn in by a charismatic villain.

The second truth is that we are terrible at putting a finger on what it is that makes these people so captivating. Beyond surface-level observations — a nice smile, or the ability to tell a good story — few of us can quantify, in an instant, what makes charismatic people so magnetic.

Want more? You can read the full article here.