Our Phones – Ourselves

Are you reading this on your phone? It’s likely that you are, and that your smartphone is such a constant companion that it’s on your person 24/7.

Was this the plan when Steve Jobs first introduced this magical device? Not at all, suggests Cal Newport. Here is how he began his insightful piece:

Smartphones are our constant companions. For many of us, their glowing screens are a ubiquitous presence, drawing us in with endless diversions, like the warm ping of social approval delivered in the forms of likes and retweets, and the algorithmically amplified outrage of the latest “breaking” news or controversy. They’re in our hands, as soon as we wake, and command our attention until the final moments before we fall asleep.

Steve Jobs would not approve.

In 2007, Mr. Jobs took the stage at the Moscone Convention Center in San Francisco and introduced the world to the iPhone. If you watch the full speech, you’ll be surprised by how he imagined our relationship with this iconic invention, because this vision is so different from the way most of us use these devices now.

In the remarks, after discussing the phone’s interface and hardware, he spends an extended amount of time demonstrating how the device leverages the touch screen before detailing the many ways Apple engineers improved the age-old process of making phone calls. “It’s the best iPod we’ve ever made,” Mr. Jobs exclaims at one point. “The killer app is making calls,” he later adds. Both lines spark thunderous applause. He doesn’t dedicate any significant time to discussing the phone’s internet connectivity features until more than 30 minutes into the address.

The presentation confirms that Mr. Jobs envisioned a simpler and more constrained iPhone experience than the one we actually have over a decade later. For example, he doesn’t focus much on apps. When the iPhone was first introduced there was no App Store, and this was by design. As Andy Grignon, an original member of the iPhone team, told me when I was researching this topic, Mr. Jobs didn’t trust third-party developers to offer the same level of aesthetically pleasing and stable experiences that Apple programmers could produce. He was convinced that the phone’s carefully designed native features were enough. It was “an iPod that made phone calls,” Mr. Grignon said to me.

Mr. Jobs seemed to understand the iPhone as something that would help us with a small number of activities — listening to music, placing calls, generating directions. He didn’t seek to radically change the rhythm of users’ daily lives. He simply wanted to take experiences we already found important and make them better.

The minimalist vision for the iPhone he offered in 2007 is unrecognizable today — and that’s a shame.

Under what I call the “constant companion model,” we now see our smartphones as always-on portals to information. Instead of improving activities that we found important before this technology existed, this model changes what we pay attention to in the first place — often in ways designed to benefit the stock price of attention-economy conglomerates, not our satisfaction and well-being.

Want more? You can read the full article here

Hold the Obits!

Over the past decade, countless obituaries have been written for books. Many were convinced the printed book would soon suffer the same fate as the dinosaurs.

Hold the obits!

The printed book is making a huge comeback. I had been aware of this in bits and pieces, but I am grateful to Timothy Egan for providing some perspective in his recent New York Times op-ed, “The Comeback of the Century.” Here’s how it begins:

Not long ago I found myself inside the hushed and high-vaulted interior of a nursing home for geriatric books, in the forgotten city of St.-Omer, France. Running my white-gloved hands over the pages of a thousand-year-old manuscript, I was amazed at the still-bright colors applied long ago in a chilly medieval scriptorium. Would anything written today still be around to touch in another millennium?

In the digital age, the printed book has experienced more than its share of obituaries. Among the most dismissive was one from Steve Jobs, who said in 2008, “It doesn’t matter how good or bad the product is, the fact is that people don’t read anymore.”

True, nearly one in four adults in this country has not read a book in the last year. But the book — with a spine, a unique scent, crisp pages and a typeface that may date to Shakespeare’s day — is back. Defying all death notices, sales of printed books continue to rise to new highs, as do the number of independent stores stocked with these voices between covers, even as sales of electronic versions are declining.

Nearly three times as many Americans read a book of history in 2017 as watched the first episode of the final season of “Game of Thrones.” The share of young adults who read poetry in that year more than doubled from five years earlier. A typical rage tweet by President Trump, misspelled and grammatically sad, may get him 100,000 “likes.” Compare that with the 28 million Americans who read a book of verse in the first year of Trump’s presidency, the highest share of the population in 15 years.

So, even with a president who is ahistoric, borderline literate and would fail a sixth-grade reading comprehension test, something wonderful and unexpected is happening in the language arts. When the dominant culture goes low, the saviors of our senses go high.

Want more? You can read it here

Artificial Intelligence

Artificial intelligence (AI) may be the most beneficial technological development of the 21st century. However, it is undoubtedly the most hyped technological development of the past two decades. This hype has raised expectations for results and, unfortunately, has clouded public understanding of the true nature of AI, both its potential and its limitations.

The highest-level U.S. security documents recognize the power of AI to support U.S. national objectives.

The National Security Strategy notes: “New advances in computing, autonomy, and manufacturing are already transforming the way we fight…From self-driving cars to autonomous weapons, the field of [AI], in particular, is progressing rapidly.”

The National Defense Strategy puts it this way: “The security environment is also affected by rapid technological advancements and the changing character of war…New technologies include artificial intelligence [and] autonomy.”

The U.S. Navy knows it needs AI, but it would be well served by articulating those needs more clearly so it can harness this critical technology.

I addressed this subject in my recent U.S. Naval Institute Proceedings article, “The Navy Needs AI, It Just Isn’t Certain Why.” Want more? You can read the full article here

AI and Humans

We have posted a great deal about artificial intelligence and its impact on us. That’s why I was drawn to a recent article, “Without Humans, A.I. Can Wreak Havoc.” Here’s how it begins:

The year 1989 is often remembered for events that challenged the Cold War world order, from the protests in Tiananmen Square to the fall of the Berlin Wall. It is less well remembered for what is considered the birth of the World Wide Web. In March of 1989, the British researcher Tim Berners-Lee shared the protocols, including HTML, URL and HTTP that enabled the internet to become a place of communication and collaboration across the globe.

As the World Wide Web marks its 30th birthday on Tuesday, public discourse is dominated by alarm about Big Tech, data privacy and viral disinformation. Tech executives have been called to testify before Congress, a popular campaign dissuaded Amazon from opening a second headquarters in New York and the United Kingdom is going after social media companies that it calls “digital gangsters.” Implicit in this tech-lash is nostalgia for a more innocent online era.

But longing for a return to the internet’s yesteryears isn’t constructive. In the early days, access to the web was expensive and exclusive, and it was not reflective or inclusive of society as a whole. What is worth revisiting is less how it felt or operated, but what the early web stood for. Those first principles of creativity, connection and collaboration are worth reconsidering today as we reflect on the past and the future promise of our digitized society.

The early days of the internet were febrile with dreams about how it might transform our world, connecting the planet and democratizing access to knowledge and power. It has certainly effected great change, if not always what its founders anticipated. If a new democratic global commons didn’t quite emerge, a new demos certainly did: An internet of people who created it, shared it and reciprocated in its use.

This is just a snippet. Want more? You can read the full article here.

Phone Time

Much ink has been spilled about how our smartphones can dominate our lives. But can they also shorten them? Catherine Price thinks so. Here’s how she began a recent piece:

If you’re like many people, you may have decided that you want to spend less time staring at your phone.

It’s a good idea: an increasing body of evidence suggests that the time we spend on our smartphones is interfering with our sleep, self-esteem, relationships, memory, attention spans, creativity, productivity and problem-solving and decision-making skills.

But there is another reason for us to rethink our relationships with our devices. By chronically raising levels of cortisol, the body’s main stress hormone, our phones may be threatening our health and shortening our lives.

Until now, most discussions of phones’ biochemical effects have focused on dopamine, a brain chemical that helps us form habits — and addictions. Like slot machines, smartphones and apps are explicitly designed to trigger dopamine’s release, with the goal of making our devices difficult to put down.

This manipulation of our dopamine systems is why many experts believe that we are developing behavioral addictions to our phones. But our phones’ effects on cortisol are potentially even more alarming.

Cortisol is our primary fight-or-flight hormone. Its release triggers physiological changes, such as spikes in blood pressure, heart rate and blood sugar, that help us react to and survive acute physical threats.

These effects can be lifesaving if you are actually in physical danger — like, say, you’re being charged by a bull. But our bodies also release cortisol in response to emotional stressors where an increased heart rate isn’t going to do much good, such as checking your phone to find an angry email from your boss.

Want more? You can read it here

High-Tech Weapons

Earlier this year, I posted blogs regarding the new directions for U.S. national security embodied in publications such as the National Security Strategy and the National Defense Strategy. Each of these publications notes that the U.S. military must adopt advanced technology to ensure it can deal with increasingly capable peer competitors.

The era of United States technological dominance has ended. Indeed, in many areas, including military technology, the gap has narrowed to parity or near-parity, and potential adversaries have all but erased what was once the U.S. military’s trump card: superior technology. Nations such as Russia and China, as well as the countries to which they proliferate weapons, are deploying advanced weapons that demonstrate many of the same technological strengths that have traditionally given the United States its high-tech advantage.

One of the most promising emerging military technologies is directed-energy weapons. The U.S. military already fields many directed-energy systems: laser range finders and targeting systems are deployed on tanks, helicopters, tactical fighters and sniper rifles. These laser systems provide both swifter engagements and greatly enhanced precision by shortening the sensor-to-shooter cycle.

Now, directed-energy weapons are poised to shorten the shooter-to-target cycle as well, often dramatically. They provide a means for instantaneous target engagement, with extremely high accuracy and at long ranges.

This is just a snippet. Want more? You can read the full article here

Internet Cleanse

Among the most popular diet ideas are those that involve a “cleanse” of some type. While there are many ways to do this, most involve giving up, at least for a time, the foods you currently eat.

I’ve always wondered if we could – and should – apply that idea to our lives on the internet. I sensed the answer but couldn’t articulate it. Thankfully, David Brooks did. Here is part of what he said:

The two most recent times I saw my friend Makoto Fujimura, he put a Kintsugi bowl in my hands. These ceramic bowls were 300 to 400 years old. But what made them special was that somewhere along the way they had broken into shards and were glued back together with a 15th-century technique using Japanese lacquer and gold.

I don’t know about you, but I feel a great hunger right now for timeless pieces like these. The internet has accelerated our experience of time, and Donald Trump has upped the pace of events to permanent frenetic.

There is a rapid, dirty river of information coursing through us all day. If you’re in the news business, or a consumer of the news business, your reaction to events has to be instant or it is outdated. If you’re on social media, there are these swarming mobs who rise out of nowhere, leave people broken and do not stick around to perform the patient Kintsugi act of gluing them back together.

Probably like you, I’ve felt a great need to take a break from this pace every once in a while and step into a slower dimension of time. Mako’s paintings are very good for these moments.

What would it mean to live generationally once in a while, in a world that now finds the daily newspaper too slow?

Want more? You can read it here

Who Codes

In his best-selling book The Innovators, Walter Isaacson traces the roots of the computer revolution to the mathematician Ada Lovelace, Lord Byron’s daughter, who lived from 1815 to 1852.

His history moves into the twentieth century and highlights another female computer pioneer, Grace Hopper.

And it is well known that women were some of the earliest coders and software pioneers more than a half-century ago.

But where are female coders today? I’d always wondered, until I read Clive Thompson’s recent piece, “The Secret History of Women in Coding.” Here’s how he begins:

As a teenager in Maryland in the 1950s, Mary Allen Wilkes had no plans to become a software pioneer — she dreamed of being a litigator. One day in junior high in 1950, though, her geography teacher surprised her with a comment: “Mary Allen, when you grow up, you should be a computer programmer!” Wilkes had no idea what a programmer was; she wasn’t even sure what a computer was. Relatively few Americans were. The first digital computers had been built barely a decade earlier at universities and in government labs.

By the time she was graduating from Wellesley College in 1959, she knew her legal ambitions were out of reach. Her mentors all told her the same thing: Don’t even bother applying to law school. “They said: ‘Don’t do it. You may not get in. Or if you get in, you may not get out. And if you get out, you won’t get a job,’ ” she recalls. If she lucked out and got hired, it wouldn’t be to argue cases in front of a judge. More likely, she would be a law librarian, a legal secretary, someone processing trusts and estates.

But Wilkes remembered her junior high school teacher’s suggestion. In college, she heard that computers were supposed to be the key to the future. She knew that the Massachusetts Institute of Technology had a few of them. So on the day of her graduation, she had her parents drive her over to M.I.T. and marched into the school’s employment office. “Do you have any jobs for computer programmers?” she asked. They did, and they hired her.

It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn’t create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants’ ability to think logically. Wilkes happened to have some intellectual preparation: As a philosophy major, she had studied symbolic logic, which can involve creating arguments and inferences by stringing together and/or statements in a way that resembles coding.

Want more? You can read it here
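To make that last comparison concrete, here is a minimal, purely illustrative sketch in Python. The hiring rule, the function name and its parameters are my own hypothetical invention, not anything from Thompson’s article; the point is simply how stringing together and/or statements in symbolic logic resembles the boolean expressions found in everyday code:

# Hypothetical example only: a hiring rule written by stringing together
# and/or statements, the kind of logical reasoning the aptitude tests of
# the era were designed to probe.
def eligible_for_programming_job(passed_aptitude_test: bool,
                                 studied_logic: bool,
                                 has_programming_experience: bool) -> bool:
    # "Consider the applicant if she passed the aptitude test AND
    #  either studied formal logic OR already has programming experience."
    return passed_aptitude_test and (studied_logic or has_programming_experience)

# Someone in Wilkes's 1959 position: no programming experience,
# but training in symbolic logic and a passed aptitude test.
print(eligible_for_programming_job(True, True, False))  # prints True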

Facebook

It’s been about five years since Jim Cramer and Bob Lang coined the acronym “FANG” for the mega-cap, high-growth stocks Facebook, Amazon, Netflix and Google (now Alphabet).

And while it just happens to lead a handy acronym, Facebook is quite possibly the most controversial tech company of all time.

For most, this is due to one person: Facebook’s CEO, Mark Zuckerberg. It has been almost a decade since The Social Network, the movie about Facebook and its founder, hit theaters with such force.

We remain fascinated by Facebook and Zuckerberg. We want to learn more, but we want something different. That’s why I was drawn in by a review of the book “Zucked.” Here’s how it begins:

The dystopia George Orwell conjured up in “1984” wasn’t a prediction. It was, instead, a reflection. Newspeak, the Ministry of Truth, the Inner Party, the Outer Party — that novel sampled and remixed a reality that Nazi and Soviet totalitarianism had already made apparent. Scary stuff, certainly, but maybe the more frightening dystopia is the one no one warned you about, the one you wake up one morning to realize you’re living inside.

Roger McNamee, an esteemed venture capitalist, would appear to agree. “A dystopian technology future overran our lives before we were ready,” he writes in “Zucked.” Think that sounds like overstatement? Let’s examine the evidence. At its peak the planet’s fourth most valuable company, and arguably its most influential, is controlled almost entirely by a young man with the charisma of a geometry T.A. The totality of this man’s professional life has been running this company, which calls itself “a platform.”

Company, platform — whatever it is, it provides a curious service wherein billions of people fill it with content: baby photos, birthday wishes, concert promotions, psychotic premonitions of Jewish lizard-men. No one is paid by the company for this labor; on the contrary, users are rewarded by being tracked across the web, even when logged out, and consequently strip-mined by a complicated artificial intelligence trained to sort surveilled information into approximately 29,000 predictive data points, which are then made available to advertisers and other third parties, who now know everything that can be known about a person without trepanning her skull. Amazingly, none of this is secret, despite the company’s best efforts to keep it so. Somehow, people still use and love this platform.

Want more? You can read the full article here

Our World – Our Minds?

It’s not much of a stretch to say that stories about big tech have dominated the headlines in recent years. We’ve all read them – and many of them are less than flattering.

That’s why I gravitated to a new book, “World Without Mind: The Existential Threat of Big Tech.” While I have my own – strong – opinions about what the book says, I found John Herrman’s review helpful in clarifying its import. Here is how he begins:

The technology critic is typically a captive figure, beholden either to a sorrowful past, a panicked present or an arrogant future. In his proudest moments, he resembles something like a theorist of transformation, decline and creation. In his lowest, he is more like a speaking canary, prone to prophecy, a game with losing odds. His attempts at optimism are framed as counterintuitive, faring little better, in predictive terms, than his lapses into pessimism. He teeters hazardously between implicating his audience and merely giving their anxieties a name. He — and it is almost always a he — is the critical equivalent of an unreliable narrator, unable to write about technology without also writing about himself. Occasionally, he is right: about what is happening, about what should happen, and about what it means. And so he carries on, and his audience with him.

Franklin Foer, thankfully, recognizes these pitfalls even if he can’t always avoid them. Who can? The melodramatically titled “World Without Mind,” Foer’s compact attempt at a broad technological polemic — which identifies the stupendous successes of Amazon, Google and Facebook, among others, as an “existential threat” to the individual and to society — begins with a disclaimer. Foer’s tumultuous stint editing The New Republic under the ownership of the Facebook co-founder Chris Hughes ended with mass resignations and public acrimony. “There’s no doubt that this experience informs the argument of this book,” he writes. He is likewise entangled through his proximity to publishing: The author’s friends, colleagues and immediate family members — including his brother, the novelist Jonathan Safran Foer — depend to different degrees on the industry Amazon consumed first. The book is dedicated to his father, Bert, a crusading antitrust lawyer.

In this slightly crouched posture, and with a hint of healthy self-doubt, Foer proceeds quickly. We, the consuming public, have failed to properly understand the new tech superpowers, he suggests, leaving little hope for stodgy and reluctant American regulators. The scope of their influence is obscured by the sheer number of things they do and sell, or problems they purport to be solving, and by our outdated sense of what constitutes a monopoly. To that end, Foer promotes the concept of the “knowledge monopoly,” which he qualifies with a mischievous grin. “My hope is that we revive ‘monopoly’ as a core piece of political rhetoric that broadly denotes dominant firms with pernicious powers,” he says, rather than as a “technical” term referring to one company cornering a market. (His new monopolists, after all, aren’t raising prices. They’re giving things away free.)

Want more? You can read the full article here