Tech Rising

What do technology and architecture have in common? Your first reaction might be, “not much,” but a closer look at what is happening to the San Francisco skyline might change your mind.

David Streitfeld’s recent piece, “San Francisco’s Skyline, Now Inescapably Transformed by Tech,” features the subtitle: “Salesforce Tower, which at 1,070 feet is the tallest office building west of the Mississippi, will be inhabited in January, signaling tech’s triumph in the city.”

This short piece in the Sunday New York Times Business section marks not just an association, but a marriage, between technology and architecture.

Streitfeld notes that in Silicon Valley, the office parks blend into the landscape. They might have made their workers exceedingly rich, they might have changed the world — whether for better or worse is currently up for debate — but there is nothing about them that says: We are a big deal.

Skyscrapers tell a different story. They are the pyramids of our civilization, permanent monuments of our existence. They show who is in charge and what they think about themselves. Salesforce Tower is breaking a San Francisco height record that stood for nearly half a century.

Intrigued? You can read the full article here.

Stories

Ever since our caveman ancestors drew pictures of their successful hunt (it had to be successful, or they wouldn’t have returned to talk about it), Homo sapiens have been enticed by stories.

And while we’ve known that for generations, today that notion is under attack by the idea of making “informed decisions” supported by DATA.

But now there is push-back, and that’s why I found David Leonhardt’s recent piece, “What I Was Wrong About This Year,” so refreshing. It puts an exclamation point on the need for STORY. Here is how he begins:

The Israeli intelligence service asked the great psychologist Daniel Kahneman for help in the 1970s, and Kahneman came back with a suggestion: Get rid of the classic intelligence report. It allows leaders to justify any conclusion they want, Kahneman said. In its place, he suggested giving the leaders estimated probabilities of events.

The intelligence service did so, and an early report concluded that one scenario would increase the chance of full-scale war with Syria by 10 percent. Seeing the number, a top official was relieved. “Ten percent increase?” he said. “That is a small difference.”

Kahneman was horrified (as Michael Lewis recounts in his book “The Undoing Project”). A 10 percent increase in the chance of catastrophic war was serious. Yet the official decided that 10 wasn’t so different from zero.

Looking back years later, Kahneman said: “No one ever made a decision because of a number. They need a story.”

Want more? You can read the full article here.

Law of Innovation

Most businesses are “all about innovation.” We have made innovation a buzzword, but few have really done a deep dive into what innovation means, especially in business.

While such a broad term defies simple explanation – and can mean many things to many people – I found Christopher Mims’s “Laws of Innovation” piece in the Wall Street Journal helpful in bounding the challenge. Here are his “laws.”

Three decades ago, a historian wrote six laws to explain society’s unease with the power and pervasiveness of technology. Though based on historical examples taken from the Cold War, the laws read as a cheat sheet for explaining our era of Facebook, Google, the iPhone and FOMO.

You’ve probably never heard of these principles or their author, Melvin Kranzberg, a professor of the history of technology at Georgia Institute of Technology who died in 1995.

What’s a bigger shame is that most of the innovators today, who are building the services and tools that have upended society, don’t know them, either.

Fortunately, the laws have been passed down by a small group of technologists who say they have profoundly impacted their thinking. The text should serve as a foundation—something like a Hippocratic oath—for all people who build things.

  1. ‘Technology is neither good nor bad; nor is it neutral.’
  2. ‘Invention is the mother of necessity.’
  3. ‘Technology comes in packages, big and small.’
  4. ‘Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.’
  5. ‘All history is relevant, but the history of technology is the most relevant.’
  6. ‘Technology is a very human activity.’

Want more? You can read the full piece here.

Lost Wars

The terrorist attacks on September 11, 2001, profoundly changed America’s national security equation – perhaps forever.

Those attacks spawned the wars in Afghanistan and Iraq, which have all but consumed the U.S. military for more than a decade and a half.

There have been several books – some good and some less so – that have tried to help us come to grips not only with why we embarked upon these wars, but also with why we can’t “win.”

Andrew Bacevich’s review of Daniel Bolger’s book, “Why We Lost,” offers some key insights. Here is how he begins:

The author of this book has a lot to answer for. “I am a United States Army general,” Daniel Bolger writes, “and I lost the Global War on Terrorism.” The fault is not his alone, of course. Bolger’s peers offered plenty of help. As he sees it, in both Afghanistan and Iraq, abysmal generalship pretty much doomed American efforts.

The judgment that those wars qualify as lost — loss defined as failing to achieve stated objectives — is surely correct. On that score, Bolger’s honesty is refreshing, even if his explanation for that failure falls short. In measured doses, self-flagellation cleanses and clarifies. But heaping all the blame on America’s generals lets too many others off the hook.

Why exactly did American military leaders get so much so wrong? Bolger floats several answers to that question but settles on this one: With American forces designed for short, decisive campaigns, the challenges posed by protracted irregular warfare caught senior officers completely by surprise.

Since there aren’t enough soldiers — having “outsourced defense to the willing,” the American people stay on the sidelines — the generals asked for more time and more money. This meant sending the same troops back again and again, perhaps a bit better equipped than the last time. With stubbornness supplanting purpose, the military persisted, “in the vain hope that something might somehow improve.”

Want more? You can read the full article here.

AI and You!

Few subjects have captured the public’s imagination today more than artificial intelligence (AI) and machine learning. A niche tech subject just a few years ago, AI has now gone mainstream.

Part of this is because we are surrounded by digital apps like Siri and Cortana that inform and entertain us daily (just ask Siri, “What is zero divided by zero?”).

But AI will play a much more profound role in our lives in the future, though we may have to wait for it. Here is part of what Steve Lohr shared recently in a New York Times piece:

There are basically three big questions about artificial intelligence and its impact on the economy: What can it do? Where is it headed? And how fast will it spread?

Three new reports combine to suggest these answers: It can probably do less right now than you think. But it will eventually do more than you probably think, in more places than you probably think, and will probably evolve faster than powerful technologies have in the past.

This bundle of research is itself a sign of the A.I. boom. Researchers across disciplines are scrambling to understand the likely trajectory, reach and influence of the technology — already finding its way into things like self-driving cars and image recognition online — in all its dimensions. Doing so raises a host of challenges of definition and measurement, because the field is moving quickly — and because companies are branding things A.I. for marketing purposes.

An “AI Index,” created by researchers at Stanford University, the Massachusetts Institute of Technology and other organizations, released on Thursday, tracks developments in artificial intelligence by measuring aspects like technical progress, investment, research citations and university enrollments. The goal of the project is to collect, curate and continually update data to better inform scientists, businesspeople, policymakers and the public.

Want more? You can read the full article here.

Thinking Well?

As human beings, we pride ourselves on being rational…after all…we’re not lemmings running off a cliff…right?

I thought we were, that is, until I read a short op-ed by David Brooks. Here is part of what he said about how rational we are:

Richard Thaler has just won an extremely well deserved Nobel Prize in economics. Thaler took an obvious point, that people don’t always behave rationally, and showed the ways we are systematically irrational.

Thanks to his work and others’, we know a lot more about the biases and anomalies that distort our perception and thinking, like the endowment effect (once you own something you value it more than before you owned it), mental accounting (you think about a dollar in your pocket differently than you think about a dollar in the bank) and all the rest.

It’s when we get to the social world that things really get gnarly. A lot of our thinking is for bonding, not truth-seeking, so most of us are quite willing to think or say anything that will help us be liked by our group. We’re quite willing to disparage anyone when, as Marilynne Robinson once put it, “the reward is the pleasure of sharing an attitude one knows is socially approved.” And when we don’t really know a subject well enough, in T. S. Eliot’s words, “we tend always to substitute emotions for thoughts,” and go with whatever idea makes us feel popular.

Want more? You can read the full article here.

The Great War?

For most Americans today, World War I is something that is consigned to history books. We learned that the United States entered the war reluctantly, but that we fought the good fight. We also get the notion that one of the results of the war was that America became a great power – and became greater during the 20th century.

That’s why I found Michael Kazin’s New York Times piece, “The Great Mistake in the Great War,” so interesting. Here is how he began:

One hundred years ago, Congress voted to enter what was then the largest and bloodiest war in history. Four days earlier, President Woodrow Wilson had sought to unite a sharply divided populace with a stirring claim that the nation “is privileged to spend her blood and her might for the principles that gave her birth and happiness and the peace which she has treasured.” The war lasted only another year and a half, but in that time, an astounding 117,000 American soldiers were killed and 202,000 wounded.

Still, most Americans know little about why the United States fought in World War I, or why it mattered. The “Great War” that tore apart Europe and the Middle East and took the lives of over 17 million people worldwide lacks the high drama and moral gravity of the Civil War and World War II, in which the very survival of the nation seemed at stake.

World War I is less easy to explain. America intervened nearly three years after it began, and the “doughboys,” as our troops were called, engaged in serious combat for only a few months. More Americans in uniform died away from the battlefield — thousands from the Spanish flu — than with weapons in hand. After victory was achieved, Wilson’s audacious hope of making a peace that would advance democracy and national self-determination blew up in his face when the Senate refused to ratify the treaty he had signed at the Palace of Versailles.

But attention should be paid. America’s decision to join the Allies was a turning point in world history. It altered the fortunes of the war and the course of the 20th century — and not necessarily for the better. Its entry most likely foreclosed the possibility of a negotiated peace among belligerent powers that were exhausted from years mired in trench warfare.

Intrigued? You can read the entire article here.

The Writing Process

There are a few writers who help define what writing is for all of us. John McPhee is one of them. That’s why I was intrigued by a review of his newest book, “Draft No. 4.” Here is part of what the reviewer had to offer:

Followers of John McPhee, perhaps the most revered nonfiction narrative journalist of our time, will luxuriate in the shipshape prose of “Draft No. 4: On the Writing Process,” a collection of eight essays that first appeared in The New Yorker, his home for more than 50 years. Writers looking for the secrets of his stripped-bark style and painstaking structure will have to be patient with what is a discursive, though often delightful, short book. McPhee’s publisher is presenting it as a “master class,” but it’s really a memoir of writing during a time of editorial cosseting that now seems as remote as the court of the Romanovs. Readerly patience will be rewarded by plentiful examples of the author’s sinewy prose and, toward the end, by advice and tips that will help writers looking to become better practitioners of the craft and to stay afloat in what has become a self-service economy.

Virtually no part of McPhee’s long career, full of months-long or years-long research trips and hours or days staring at a blank computer screen, resembles the churn-it-out grind of today’s professional web writer. Except the earliest part, which he returns to often: the English class at Princeton High School whose teacher, Mrs. McKee, made him write three pieces a week (“Not every single week. Some weeks had Thanksgiving in them”) for three solid years and encouraged her students to critique one another, to the point of hissing and spitballs. Her constant deadlines led him to devise a crucial tactic: Force yourself to break from “wallowing in all those notes” and determine an ending, then go back to worrying about the beginning. Which leads to the first formal rule he provides, and then only a quarter of the way through the book: When you’re getting nowhere and “you don’t know what to do. Stop everything. Stop looking at the notes. Hunt through your mind for a good beginning. Then write it. Write a lead.”

Want more? You can read the full article here.

Challenges

Victor Davis Hanson is a force of nature. Recently, he commented on the state of our nation and the challenges we’ve built for ourselves. Here’s how he began:

Our Baby Boomer elites, mired in excess and safe in their enclaves, have overseen the decay of our core cultural institutions.

Since the Trojan War, generations have always trashed their own age in comparison to ages past. The idea of fated decadence and decline was a specialty of 19th-century German philosophy.

So we have to be careful in calibrating generations, especially when our own has reached a level of technology and science never before dreamed of (and it is not a given that material or ethical progress is always linear).

Nonetheless, the so-called Baby Boomers have a lot to account for — given the sorry state of entertainment, sports, the media, and universities.

The Harvey Weinstein episode revealed two generational truths about Hollywood culture.

One, the generation that gave us the free-love and the anything-goes morals of Woodstock discovered that hook-up sex was “contrary to nature.” Sexual congress anywhere, any time, anyhow, with anyone — near strangers included — is not really liberating and can often be deeply imbedded within harassment and ultimately the male degradation of women.

Somehow a demented Harvey Weinstein got into his head that the fantasy women in his movies who were customarily portrayed as edgy temptresses and promiscuous sirens were reflections of the way women really were in Los Angeles and New York — or the way that he thought they should be. It was almost as if Weinstein sought to become as physically repulsive and uncouth as possible — all the better to humiliate (through beauty-and-the-beast asymmetry) the vulnerable and attractive women he coerced.

Want more? You can read the full piece here.

Intellectual Property

There was a time when movies were based on either a book (typically a very good book) or an original screenplay (typically by a great screenwriter). That was then, this is now.

I’d always had the notion that something was changing, but I found Alex French’s article in The New York Times Magazine, “How to Make a Movie Out of Anything — Even a Mindless Phone Game,” so revealing – and so frightening. Here is how he began:

In 2013 a movie producer named Tripp Vinson was thumbing through Variety when he stumbled upon a confounding item: Phil Lord and Christopher Miller, a pair of writers and directors, were working on something called ‘‘The Lego Movie.’’ Vinson was baffled. ‘‘I had no idea where they were going to go with Legos,’’ he says. ‘‘There’s no character; no narrative; no theme. Nothing.’’

Since Vinson got into the business, something has changed in Hollywood. More and more movies are developed from intellectual property: already existing stories or universes or characters that have a built-in fan base. Vinson thinks it started in 2007, when the Writers Guild went on strike. ‘‘Before the strike, the studios were each making 20-­something movies a year,’’ he says. ‘‘Back then, you could get a thriller made. After the strike, they cut back dramatically on the number of films they made. It became all about I.P.’’ — intellectual property. With fewer bets to place, the studios became more cautious. ‘‘The way to cut through the noise is hitching yourself onto something customers have some exposure to already,’’ he says. ‘‘Something familiar. You’re not starting from scratch. If you’re going to work in the studio system, you better have a really big I.P. behind you.’’

This trend toward I.P.-­based movies has been profound. In 1996, of the top 20 grossing films, nine were live-­action movies based on wholly original screenplays. In 2016, just one of the top 20 grossing movies, ‘‘La La Land,’’ fit that bill. Just about everything else was part of the Marvel universe or the DC Comics universe or the ‘‘Harry Potter’’ universe or the ‘‘Star Wars’’ universe or the ‘‘Star Trek’’ universe or the fifth Jason Bourne film or the third ‘‘Kung Fu Panda’’ or a super-­high-­tech remake of ‘‘Jungle Book.’’ Just outside the top 20, there was a remake of ‘‘Ghostbusters’’ and yet another version of ‘‘Tarzan.’’

Want more? You can read the full article here.