One Trillion


The stock market – especially tech – has been down a bit, and it’s easy to forget that not that long ago Apple was valued at one trillion dollars.

There has been a great deal of breathless reporting on this milestone, but much less thoughtful analysis. That’s why I was taken by Jack Nicas’ piece. Here’s how he began:

SAN FRANCISCO — In 1997, Apple was on the ropes. The Silicon Valley pioneer was being decimated by Microsoft and its many partners in the personal-computer market. It had just cut a third of its work force, and it was about 90 days from going broke, Apple’s late co-founder, Steve Jobs, later said.

Recently, Apple became the first publicly traded American company to be worth more than $1 trillion when its shares climbed 3 percent to end the day at $207.39. The gains came two days after the company announced the latest in a series of remarkably profitable quarters.

Apple’s ascent from the brink of bankruptcy to the world’s most valuable public company has been a business tour de force, marked by rapid innovation, a series of smash-hit products and the creation of a sophisticated, globe-spanning supply chain that keeps costs down while producing enormous volumes of cutting-edge devices.

That ascent has also been marked by controversy, tragedy and challenges. Apple’s aggressive use of outside manufacturers in China, for example, has led to criticism that it is taking advantage of poorly paid workers in other countries and robbing Americans of good manufacturing jobs. The company faces numerous questions about how it can continue to grow.

This is just a snippet. Want more? You can read the full article here.

Why Gig?


As you hold your smartphone and consider how it has changed your life, you might be inclined to think that the tech industry alone has created the gig economy. But you would be wrong.

The gig economy is enabled by technology, but technology didn’t create it. It is a result of the insecure nature of work today – a far cry from the era of baby boomers’ parents, who went to work for one company and retired at 65 with their gold watch.

I read one of the best explanations of this change in a piece entitled “The Gig Economy Isn’t the iPhone’s Fault.” Here’s how it begins:

When we learn about the Industrial Revolution in school, we hear a lot about factories, steam engines, maybe the power loom. We are taught that technological innovation drove social change and radically reshaped the world of work.

Likewise, when we talk about today’s economy, we focus on smartphones, artificial intelligence, apps. Here, too, the inexorable march of technology is thought to be responsible for disrupting traditional work, phasing out the employee with a regular wage or salary and phasing in independent contractors, consultants, temps and freelancers — the so-called gig economy.

But this narrative is wrong. The history of labor shows that technology does not usually drive social change. On the contrary, social change is typically driven by decisions we make about how to organize our world. Only later does technology swoop in, accelerating and consolidating those changes.

This insight is crucial for anyone concerned about the insecurity and other shortcomings of the gig economy. For it reminds us that far from being an unavoidable consequence of technological progress, the nature of work always remains a matter of social choice. It is not a result of an algorithm; it is a collection of decisions by corporations and policymakers.

Want more? You can read the full article here.

Tech Anniversary


San Francisco in 1968 was littered with flower children, free love and dreams of utopia encapsulated in Timothy Leary’s exhortation: “Turn on, tune in, drop out.” How wrong that was! But out of this purple haze rose that year’s Joint Computer Conference, where an assembly of geniuses wearing white short-sleeved shirts and pocket protectors convened 50 years ago this week. The event shined a guiding light on the path to personal computing and set the modern world in motion.

On Dec. 9, 1968, Doug Engelbart of the Stanford Research Institute presented what’s now known as “The Mother of All Demos.” Using a homemade modem, a video feed from Menlo Park, and a quirky hand-operated device, Engelbart gave a 90-minute demonstration of hypertext, videoconferencing, teleconferencing and a networked operating system. Oh, and a graphical user interface, display editing, multiple windows, shared documents, context-sensitive help and a digital library. Mother of all demos is right. That quirky device later became known as the computer mouse. The audience felt as if it had stepped into Oz, watching the world transform from black-and-white to color. But it was no hallucination.

So what have we learned in 50 years? First, augmenting humans is the purpose of technology and ought not be feared. Engelbart described the possibilities in a 1970 paper. “There will emerge a new ‘marketplace,’ representing fantastic wealth in commodities of knowledge, service, information, processing, storage,” he predicted. “In the number and range of transactions, and in the speed and flexibility with which they are negotiated, this new market will have a vitality and dynamism as much greater than today’s as today’s is greater than the village market.” Today Google is Memex 1.0, while Amazon and a whole world of e-commerce have realized the digital market.

Minecraft Overdrive


Few dispute that games mirror reality and that reality often resembles games. But the game of Minecraft takes this to new levels. And this is driving innovation. Here’s why:

Since its release seven years ago, Minecraft has become a global sensation, captivating a generation of children. There are over 100 million registered players, and it’s now the third-best-selling video game in history, after Tetris and Wii Sports. In 2014, Microsoft bought Minecraft — and Mojang, the Swedish game studio behind it — for $2.5 billion.

There have been blockbuster games before, of course. But as Jordan’s experience suggests — and as parents peering over their children’s shoulders sense — Minecraft is a different sort of phenomenon.

For one thing, it doesn’t really feel like a game. It’s more like a destination, a technical tool, a cultural scene, or all three put together: a place where kids engineer complex machines, shoot videos of their escapades that they post on YouTube, make art and set up servers, online versions of the game where they can hang out with friends. It’s a world of trial and error and constant discovery, stuffed with byzantine secrets, obscure text commands and hidden recipes. And it runs completely counter to most modern computing trends. Where companies like Apple and Microsoft and Google want our computers to be easy to manipulate — designing point-and-click interfaces under the assumption that it’s best to conceal from the average user how the computer works — Minecraft encourages kids to get under the hood, break things, fix them and turn mooshrooms into random-number generators. It invites them to tinker.

In this way, Minecraft culture is a throwback to the heady early days of the digital age. In the late ’70s and ’80s, the arrival of personal computers like the Commodore 64 gave rise to the first generation of kids fluent in computation. They learned to program in Basic, to write software that they swapped excitedly with their peers. It was a playful renaissance that eerily parallels the embrace of Minecraft by today’s youth. As Ian Bogost, a game designer and professor of media studies at Georgia Tech, puts it, Minecraft may well be this generation’s personal computer.

Want more? You can read the full article here.

Silicon Valley and Saudi Arabia


Few countries have dominated the international news more recently than Saudi Arabia. While there are conflicting reports regarding who ordered the murder of Saudi journalist Jamal Khashoggi, the fact that he was murdered by fellow countrymen is beyond dispute. At issue for the United States is its long-standing alliance with Saudi Arabia.

But the issues surrounding Khashoggi’s murder have brought to light another, heretofore hidden, issue – the relationship between Silicon Valley and Saudi Arabia.

That’s why I was fascinated by a recent piece regarding this relationship. Here is how it began:

Somewhere in the United States, someone is getting into an Uber en route to a WeWork co-working space. Their dog is with a walker whom they hired through the app Wag. They will eat a lunch delivered by DoorDash, while participating in several chat conversations on Slack. And, for all of it, they have an unlikely benefactor to thank: the Kingdom of Saudi Arabia.

Long before the dissident Saudi journalist Jamal Khashoggi vanished, the kingdom has sought influence in the West — perhaps intended, in part, to make us forget what it is. A medieval theocracy that still beheads by sword, doubling as a modern nation with malls (including a planned mall offering indoor skiing), Saudi Arabia has been called “an ISIS that made it.” Remarkably, the country has avoided pariah status in the United States thanks to our thirst for oil, Riyadh’s carefully cultivated ties with Washington, its big arms purchases, and the two countries’ shared interest in counterterrorism. But lately the Saudis have been growing their circle of American enablers, pouring billions into Silicon Valley technology companies.

While an earlier generation of Saudi leaders, like Prince Alwaleed bin Talal, invested billions of dollars in blue-chip companies in the United States, the kingdom’s new crown prince, Mohammed bin Salman, has shifted Saudi Arabia’s investment attention from Wall Street to Silicon Valley. Saudi Arabia’s Public Investment Fund has become one of Silicon Valley’s biggest swinging checkbooks, working mostly through a $100 billion fund raised by SoftBank (a Japanese company), which has swashbuckled its way through the technology industry, often taking multibillion-dollar stakes in promising companies. The Public Investment Fund put $45 billion into SoftBank’s first Vision Fund, and Bloomberg recently reported that the Saudi fund would invest another $45 billion into SoftBank’s second Vision Fund.

SoftBank, with the help of that Saudi money, is now said to be the largest shareholder in Uber. It has also put significant money into a long list of start-ups that includes Wag, DoorDash, WeWork, Plenty, Cruise, Katerra, Nvidia and Slack. As the world fills up car tanks with gas and climate change worsens, Saudi Arabia reaps enormous profits — and some of that money shows up in the bank accounts of fast-growing companies that love to talk about “making the world a better place.”

Want more? You can read the full article here.

Tech Survivor


It is almost a cliché to say that our lives have been unalterably changed by the five “FAANG” companies. Most know that FAANG is an acronym for the five most popular and best-performing tech stocks in the market: Facebook, Apple, Amazon, Netflix, and Alphabet’s Google.

Of these five companies, it seems that Apple is the one that intrigues us the most. That is likely due to the company’s history and also to the charisma of Steve Jobs.

That’s why I was drawn into an article, “How Apple Thrived in a Season of Tech Scandals.” Here’s how Farhad Manjoo begins:

The business world has long been plagued by Apple catastrophists — investors, analysts, rival executives and journalists who look at the world’s most valuable company and proclaim it to be imminently doomed.

The critics’ worry for Apple is understandable, even if their repeated wrongness is a little hilarious. Apple’s two-decade ascent from a near-bankrupt has-been of the personal computer era into the first trillion-dollar corporation has defied every apparent rule in tech.

Companies that make high-priced hardware products aren’t supposed to be as popular, as profitable or as permanent. To a lot of people in tech, Apple’s success can seem like a fluke, and every new hurdle the company has faced — the rise of Android, the death of Steve Jobs, the saturation of the smartphone market, the ascendance of artificial intelligence and cloud software — has looked certain to do it in.

But this year, as it begins to roll out a new set of iPhones, the storyline surrounding Apple has improbably shifted. In an era of growing skepticism about the tech industry’s impact on society, Apple’s business model is turning out to be its most lasting advantage.

Because Apple makes money by selling phones rather than advertising, it has been able to hold itself up as a guardian against a variety of digital plagues: a defender of your privacy, an agitator against misinformation and propaganda, and even a plausible warrior against tech addiction, a problem enabled by the very irresistibility of its own devices.

Though it is already more profitable than any of its rivals, Apple appears likely to emerge even stronger from tech’s season of crisis. In the long run, its growing strength could profoundly alter the industry.

Want more? You can read the full article here.

Social Media


Much ink has been spilled regarding how much social media impacts our lives – much of it shrill. That’s why I was taken by a recent piece, “Tweeting Into the Abyss.” The writer reviews Jaron Lanier’s book, “Ten Arguments for Deleting Your Social Media Accounts Right Now.” If that doesn’t get your attention, what will? Here’s how it begins:

My self-justifications were feeble. They could be described as hypocritical even. I had written a book denouncing Facebook, yet maintained an account on Mark Zuckerberg’s manipulation machine. Despite my comprehensive awareness of the perils, I would occasionally indulge in the voyeurism of the News Feed, succumb to zombie scrolling and would take the hit of dopamine that Sean Parker, Facebook’s founding president, has admitted is baked into the product. In internal monologues, I explained my behavior as a professional necessity. How could I describe the perniciousness of the platform if I never used it?

Critics of the big technology companies have refrained from hectoring users to quit social media. It’s far more comfortable to slam a corporate leviathan than it is to shame your aunt or high school pals — or, for that matter, to jettison your own long list of “friends.” As our informational ecosystem has been rubbished, we have placed very little onus on the more than two billion users of Facebook and Twitter. So I’m grateful to Jaron Lanier for redistributing blame on the lumpen-user, for pressing the public to flee social media. He writes, “If you’re not part of the solution, there will be no solution.”

Want more? You can read the full article here.

We Like Us


One of the things most people agree on is that high self-esteem is good, and low self-esteem is bad. Most of us more-or-less accept that “truth.”

That’s why I was quite taken by the review of “Selfie,” a book that tries to get at the root of how we’ve gone from just having self-esteem to being self-obsessed. Here’s how it begins:

Worrying about one’s own narcissism has a whiff of paradox. If we are suffering from self-obsession, should we really feed the disease by poring over another book about ourselves? Well, perhaps just one more.

“Selfie: How We Became So Self-Obsessed and What It’s Doing to Us,” by Will Storr, a British reporter and novelist, is an intriguing odyssey of self-discovery, in two senses. First, it tells a personal tale. Storr confesses to spending much of his time in a state of self-loathing and he would like to know why. On a quest to explore self-esteem and its opposite, he interviews all sorts of people, from CJ, a young American woman whose life revolves around snapping, processing and posting hundreds of thousands of selfies, to John, a vicious London gangster who repented of his selfish ways, possibly because of his mother’s prayers to St. Jude. Storr takes part in encounter groups in California, grills a Benedictine monk cloistered at Pluscarden Abbey in Scotland, and gets academic psychologists to chat frankly about their work. Storr’s side of the conversations he recounts tends to be blunt, inquisitive and peppered with salty British swearing. One comes to like him, even if he does not often like himself.

Want more? You can read the full article here.

Battling Moguls – Killer Robots


Earlier this month I posted a blog entry on one of the most controversial issues at the nexus of technology and national security: concerns regarding the “militarization” of artificial intelligence – AI.

Initially an issue consigned to just a few defense-related publications and websites, it has now moved front and center. Some of what is said is shrill, but some is far less so.

That’s why I was taken by a piece in the New York Times entitled:

Mark Zuckerberg, Elon Musk and the Feud Over Killer Robots

with the subtitle

As the tech moguls disagree over the risks presented by something that doesn’t exist yet, all of Silicon Valley is learning about unintended consequences of A.I.

Here’s how it begins:

Mark Zuckerberg thought his fellow Silicon Valley billionaire Elon Musk was behaving like an alarmist.

Mr. Musk, the entrepreneur behind SpaceX and the electric-car maker Tesla, had taken it upon himself to warn the world that artificial intelligence was “potentially more dangerous than nukes” in television interviews and on social media.

So, on Nov. 19, 2014, Mr. Zuckerberg, Facebook’s chief executive, invited Mr. Musk to dinner at his home in Palo Alto, Calif. Two top researchers from Facebook’s new artificial intelligence lab and two other Facebook executives joined them.

As they ate, the Facebook contingent tried to convince Mr. Musk that he was wrong. But he wasn’t budging. “I genuinely believe this is dangerous,” Mr. Musk told the table, according to one of the dinner’s attendees, Yann LeCun, the researcher who led Facebook’s A.I. lab.

Mr. Musk’s fears of A.I., distilled to their essence, were simple: If we create machines that are smarter than humans, they could turn against us. (See: “The Terminator,” “The Matrix,” and “2001: A Space Odyssey.”) Let’s for once, he was saying to the rest of the tech industry, consider the unintended consequences of what we are creating before we unleash it on the world.

Neither Mr. Musk nor Mr. Zuckerberg would talk in detail about the dinner, which has not been reported before, or their long-running A.I. debate.

The creation of “superintelligence” — the name for the supersmart technological breakthrough that takes A.I. to the next level and creates machines that not only perform narrow tasks that typically require human intelligence (like self-driving cars) but can actually outthink humans — still feels like science fiction. But the fight over the future of A.I. has spread across the tech industry.

You can read the full article here.

AI and National Security


One of the most controversial issues at the nexus of technology and national security is the “militarization” of artificial intelligence – AI.

While this has been an issue for some time, it recently grabbed banner headlines over Google’s support for a Pentagon initiative called “Project Maven.”

The company’s relationship with the Defense Department since it won a share of the contract for the Maven program, which uses artificial intelligence to interpret video images and could be used to improve the targeting of drone strikes, has touched off an existential crisis, according to emails and documents reviewed by The Times as well as interviews with about a dozen current and former Google employees.

Google, hoping to head off a rebellion by employees upset that the technology they were working on could be used for lethal purposes, will not renew a contract with the Pentagon for artificial intelligence work when a current deal expires next year.

But it is not unusual for Silicon Valley’s big companies to have deep military ties. And the internal dissent over Maven stands in contrast to Google’s biggest competitors for selling cloud-computing services — Amazon.com and Microsoft — which have aggressively pursued Pentagon contracts without pushback from their employees.

Expect this issue to remain controversial as the U.S. military faces increasingly capable foes and as AI and machine learning offer ways to help our warfighters prevail.

You can read these two articles here and here.