Who Codes

In his best-selling book, The Innovators, Walter Isaacson traces the roots of the computer revolution to the mathematician Ada Lovelace, Lord Byron’s daughter, who lived from 1815 to 1852.

His history moves into the twentieth century and highlights another female computer pioneer, Grace Hopper.

And it is well known that women were some of the earliest coders and software pioneers over a half-century ago.

But where are female coders today? I’d always wondered, until I read Clive Thompson’s recent piece, “The Secret History of Women in Coding.” Here’s how he begins:

As a teenager in Maryland in the 1950s, Mary Allen Wilkes had no plans to become a software pioneer — she dreamed of being a litigator. One day in junior high in 1950, though, her geography teacher surprised her with a comment: “Mary Allen, when you grow up, you should be a computer programmer!” Wilkes had no idea what a programmer was; she wasn’t even sure what a computer was. Relatively few Americans were. The first digital computers had been built barely a decade earlier at universities and in government labs.

By the time she was graduating from Wellesley College in 1959, she knew her legal ambitions were out of reach. Her mentors all told her the same thing: Don’t even bother applying to law school. “They said: ‘Don’t do it. You may not get in. Or if you get in, you may not get out. And if you get out, you won’t get a job,’ ” she recalls. If she lucked out and got hired, it wouldn’t be to argue cases in front of a judge. More likely, she would be a law librarian, a legal secretary, someone processing trusts and estates.

But Wilkes remembered her junior high school teacher’s suggestion. In college, she heard that computers were supposed to be the key to the future. She knew that the Massachusetts Institute of Technology had a few of them. So on the day of her graduation, she had her parents drive her over to M.I.T. and marched into the school’s employment office. “Do you have any jobs for computer programmers?” she asked. They did, and they hired her.

It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn’t create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants’ ability to think logically. Wilkes happened to have some intellectual preparation: As a philosophy major, she had studied symbolic logic, which can involve creating arguments and inferences by stringing together and/or statements in a way that resembles coding. Want more? You can read it here
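Thompson’s comparison between symbolic logic and coding is easy to see in practice. As a rough sketch of my own (not from his article), here is how an inference built by stringing together and/or statements looks when written as a short program; the applicant-screening rule and the function name are purely hypothetical:

```python
# Hypothetical illustration: a screening rule expressed as strung-together
# and/or statements, the same way an argument is built in symbolic logic.

def passes_screen(has_degree: bool, studied_logic: bool, has_experience: bool) -> bool:
    # "Accept if the applicant has a degree AND (studied logic OR has experience)"
    # is a conjunction joined to a disjunction, the kind of inference
    # a philosophy major would have practiced in a symbolic logic course.
    return has_degree and (studied_logic or has_experience)

# A Wilkes-like applicant in 1959: philosophy degree, symbolic logic training,
# no programming experience.
print(passes_screen(True, True, False))  # True
```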

Facebook

It’s been about five years since Jim Cramer and Bob Lang coined the acronym “FANG” for the mega-cap, high-growth stocks Facebook, Amazon, Netflix and Google (now part of Alphabet).

And while it just happens to lead a handy acronym, Facebook is quite possibly the most controversial tech company of all time.

For most, this is due to one person: Facebook’s CEO, Mark Zuckerberg. It has been almost a decade since The Social Network, the movie about Facebook and its founder, hit theaters with such force.

We remain fascinated by Facebook and Zuckerberg. We want to learn more, but we want something different. That’s why I was drawn in by a book review of “Zucked.” Here’s how it begins:

The dystopia George Orwell conjured up in “1984” wasn’t a prediction. It was, instead, a reflection. Newspeak, the Ministry of Truth, the Inner Party, the Outer Party — that novel sampled and remixed a reality that Nazi and Soviet totalitarianism had already made apparent. Scary stuff, certainly, but maybe the more frightening dystopia is the one no one warned you about, the one you wake up one morning to realize you’re living inside.

Roger McNamee, an esteemed venture capitalist, would appear to agree. “A dystopian technology future overran our lives before we were ready,” he writes in “Zucked.” Think that sounds like overstatement? Let’s examine the evidence. At its peak the planet’s fourth most valuable company, and arguably its most influential, is controlled almost entirely by a young man with the charisma of a geometry T.A. The totality of this man’s professional life has been running this company, which calls itself “a platform.”

Company, platform — whatever it is, it provides a curious service wherein billions of people fill it with content: baby photos, birthday wishes, concert promotions, psychotic premonitions of Jewish lizard-men. No one is paid by the company for this labor; on the contrary, users are rewarded by being tracked across the web, even when logged out, and consequently strip-mined by a complicated artificial intelligence trained to sort surveilled information into approximately 29,000 predictive data points, which are then made available to advertisers and other third parties, who now know everything that can be known about a person without trepanning her skull. Amazingly, none of this is secret, despite the company’s best efforts to keep it so. Somehow, people still use and love this platform. Want more? You can read the full article here

Our World – Our Minds?

It’s not much of a stretch to say that stories about big tech have dominated the headlines in recent years. We’ve all read them – and many of them are less than flattering.

That’s why I gravitated to a new book: “World Without Mind: The Existential Threat of Big Tech.” While I have my own – strong – opinions about what the book says, I found John Herrman’s review clarifying in explaining its import. Here is how he begins:

The technology critic is typically a captive figure, beholden either to a sorrowful past, a panicked present or an arrogant future. In his proudest moments, he resembles something like a theorist of transformation, decline and creation. In his lowest, he is more like a speaking canary, prone to prophecy, a game with losing odds. His attempts at optimism are framed as counterintuitive, faring little better, in predictive terms, than his lapses into pessimism. He teeters hazardously between implicating his audience and merely giving their anxieties a name. He — and it is almost always a he — is the critical equivalent of an unreliable narrator, unable to write about technology without also writing about himself. Occasionally, he is right: about what is happening, about what should happen, and about what it means. And so he carries on, and his audience with him.

Franklin Foer, thankfully, recognizes these pitfalls even if he can’t always avoid them. Who can? The melodramatically titled “World Without Mind,” Foer’s compact attempt at a broad technological polemic — which identifies the stupendous successes of Amazon, Google and Facebook, among others, as an “existential threat” to the individual and to society — begins with a disclaimer. Foer’s tumultuous stint editing The New Republic under the ownership of the Facebook co-founder Chris Hughes ended with mass resignations and public acrimony. “There’s no doubt that this experience informs the argument of this book,” he writes. He is likewise entangled through his proximity to publishing: The author’s friends, colleagues and immediate family members — including his brother, the novelist Jonathan Safran Foer — depend to different degrees on the industry Amazon consumed first. The book is dedicated to his father, Bert, a crusading antitrust lawyer.

In this slightly crouched posture, and with a hint of healthy self-doubt, Foer proceeds quickly. We, the consuming public, have failed to properly understand the new tech superpowers, he suggests, leaving little hope for stodgy and reluctant American regulators. The scope of their influence is obscured by the sheer number of things they do and sell, or problems they purport to be solving, and by our outdated sense of what constitutes a monopoly. To that end, Foer promotes the concept of the “knowledge monopoly,” which he qualifies with a mischievous grin. “My hope is that we revive ‘monopoly’ as a core piece of political rhetoric that broadly denotes dominant firms with pernicious powers,” he says, rather than as a “technical” term referring to one company cornering a market. (His new monopolists, after all, aren’t raising prices. They’re giving things away free.) Want more? You can read the full article here

One Trillion

The stock market – especially tech – has been down a bit, and it’s easy to forget that not that long ago Apple was valued at one trillion dollars.

There has been a great deal of breathless reporting on this milestone, but much less thoughtful analysis. That’s why I was taken by Jack Nicas’ piece. Here’s how he began:

SAN FRANCISCO — In 1997, Apple was on the ropes. The Silicon Valley pioneer was being decimated by Microsoft and its many partners in the personal-computer market. It had just cut a third of its work force, and it was about 90 days from going broke, Apple’s late co-founder, Steve Jobs, later said.

Recently, Apple became the first publicly traded American company to be worth more than $1 trillion when its shares climbed 3 percent to end the day at $207.39. The gains came two days after the company announced the latest in a series of remarkably profitable quarters.

Apple’s ascent from the brink of bankruptcy to the world’s most valuable public company has been a business tour de force, marked by rapid innovation, a series of smash-hit products and the creation of a sophisticated, globe-spanning supply chain that keeps costs down while producing enormous volumes of cutting-edge devices.

That ascent has also been marked by controversy, tragedy and challenges. Apple’s aggressive use of outside manufacturers in China, for example, has led to criticism that it is taking advantage of poorly paid workers in other countries and robbing Americans of good manufacturing jobs. The company faces numerous questions about how it can continue to grow. This is just a snippet. Want more? You can read the full article here.

Why Gig?

As you hold your smartphone and consider how it has changed your life, you might be inclined to think that the tech industry alone created the gig economy. But you would be wrong.

The gig economy is enabled by technology, but technology didn’t create it; it is the result of the insecure nature of work today – a far cry from the world of baby boomers’ parents, who went to work for one company and retired at 65 with a gold watch.

I read one of the best explanations of this change in a piece entitled “The Gig Economy Isn’t the iPhone’s Fault.” Here’s how it began:

When we learn about the Industrial Revolution in school, we hear a lot about factories, steam engines, maybe the power loom. We are taught that technological innovation drove social change and radically reshaped the world of work.

Likewise, when we talk about today’s economy, we focus on smartphones, artificial intelligence, apps. Here, too, the inexorable march of technology is thought to be responsible for disrupting traditional work, phasing out the employee with a regular wage or salary and phasing in independent contractors, consultants, temps and freelancers — the so-called gig economy.

But this narrative is wrong. The history of labor shows that technology does not usually drive social change. On the contrary, social change is typically driven by decisions we make about how to organize our world. Only later does technology swoop in, accelerating and consolidating those changes.

This insight is crucial for anyone concerned about the insecurity and other shortcomings of the gig economy. For it reminds us that far from being an unavoidable consequence of technological progress, the nature of work always remains a matter of social choice. It is not a result of an algorithm; it is a collection of decisions by corporations and policymakers.

Want more? You can read the full article here

Tech Anniversary

San Francisco in 1968 was littered with flower children, free love and dreams of utopia encapsulated in Timothy Leary’s exhortation: “Turn on, tune in, drop out.” How wrong that was! But out of this purple haze rose that year’s Joint Computer Conference, where an assembly of geniuses wearing white short-sleeved shirts and pocket protectors convened 50 years ago this week. The event shined a guiding light on the path to personal computing and set the modern world in motion.

On Dec. 9, 1968, Doug Engelbart of the Stanford Research Institute presented what’s now known as “The Mother of All Demos.” Using a homemade modem, a video feed from Menlo Park, and a quirky hand-operated device, Engelbart gave a 90-minute demonstration of hypertext, videoconferencing, teleconferencing and a networked operating system. Oh, and graphical user interface, display editing, multiple windows, shared documents, context-sensitive help and a digital library. Mother of all demos is right. That quirky device later became known as the computer mouse. The audience felt as if it had stepped into Oz, watching the world transform from black-and-white to color. But it was no hallucination.

So what have we learned in 50 years? First, augmenting humans is the purpose of technology and ought not be feared. Engelbart described the possibilities in a 1970 paper. “There will emerge a new ‘marketplace,’ representing fantastic wealth in commodities of knowledge, service, information, processing, storage,” he predicted. “In the number and range of transactions, and in the speed and flexibility with which they are negotiated, this new market will have a vitality and dynamism as much greater than today’s as today’s is greater than the village market.” Today Google is Memex 1.0, while Amazon and a whole world of e-commerce have realized the digital market.

Minecraft Overdrive

Few dispute that games mirror reality and that reality often resembles games. But the game of Minecraft takes this to new levels. And this is driving innovation. Here’s why:

Since its release seven years ago, Minecraft has become a global sensation, captivating a generation of children. There are over 100 million registered players, and it’s now the third-best-selling video game in history, after Tetris and Wii Sports. In 2014, Microsoft bought Minecraft — and Mojang, the Swedish game studio behind it — for $2.5 billion.

There have been blockbuster games before, of course. But as Jordan’s experience suggests — and as parents peering over their children’s shoulders sense — Minecraft is a different sort of phenomenon.

For one thing, it doesn’t really feel like a game. It’s more like a destination, a technical tool, a cultural scene, or all three put together: a place where kids engineer complex machines, shoot videos of their escapades that they post on YouTube, make art and set up servers, online versions of the game where they can hang out with friends. It’s a world of trial and error and constant discovery, stuffed with byzantine secrets, obscure text commands and hidden recipes. And it runs completely counter to most modern computing trends. Where companies like Apple and Microsoft and Google want our computers to be easy to manipulate — designing point-and-click interfaces under the assumption that it’s best to conceal from the average user how the computer works — Minecraft encourages kids to get under the hood, break things, fix them and turn mooshrooms into random-number generators. It invites them to tinker.

In this way, Minecraft culture is a throwback to the heady early days of the digital age. In the late ’70s and ’80s, the arrival of personal computers like the Commodore 64 gave rise to the first generation of kids fluent in computation. They learned to program in Basic, to write software that they swapped excitedly with their peers. It was a playful renaissance that eerily parallels the embrace of Minecraft by today’s youth. As Ian Bogost, a game designer and professor of media studies at Georgia Tech, puts it, Minecraft may well be this generation’s personal computer.

Want more? You can read the full article here

Silicon Valley and Saudi Arabia

Few countries have dominated the international news more recently than Saudi Arabia. While there are conflicting reports regarding who ordered the murder of Saudi journalist Jamal Khashoggi, the fact that he was murdered by fellow countrymen is beyond dispute. At issue for the United States is its long-standing alliance with Saudi Arabia.

But the issues surrounding Khashoggi’s murder have brought to light another, heretofore hidden, issue – the relationship between Silicon Valley and Saudi Arabia.

That’s why I was fascinated by a recent piece regarding this relationship. Here is how it began:

Somewhere in the United States, someone is getting into an Uber en route to a WeWork co-working space. Their dog is with a walker whom they hired through the app Wag. They will eat a lunch delivered by DoorDash, while participating in several chat conversations on Slack. And, for all of it, they have an unlikely benefactor to thank: the Kingdom of Saudi Arabia.

Long before the dissident Saudi journalist Jamal Khashoggi vanished, the kingdom has sought influence in the West — perhaps intended, in part, to make us forget what it is. A medieval theocracy that still beheads by sword, doubling as a modern nation with malls (including a planned mall offering indoor skiing), Saudi Arabia has been called “an ISIS that made it.” Remarkably, the country has avoided pariah status in the United States thanks to our thirst for oil, Riyadh’s carefully cultivated ties with Washington, its big arms purchases, and the two countries’ shared interest in counterterrorism. But lately the Saudis have been growing their circle of American enablers, pouring billions into Silicon Valley technology companies.

While an earlier generation of Saudi leaders, like Prince Alwaleed bin Talal, invested billions of dollars in blue-chip companies in the United States, the kingdom’s new crown prince, Mohammed bin Salman, has shifted Saudi Arabia’s investment attention from Wall Street to Silicon Valley. Saudi Arabia’s Public Investment Fund has become one of Silicon Valley’s biggest swinging checkbooks, working mostly through a $100 billion fund raised by SoftBank (a Japanese company), which has swashbuckled its way through the technology industry, often taking multibillion-dollar stakes in promising companies. The Public Investment Fund put $45 billion into SoftBank’s first Vision Fund, and Bloomberg recently reported that the Saudi fund would invest another $45 billion into SoftBank’s second Vision Fund.

SoftBank, with the help of that Saudi money, is now said to be the largest shareholder in Uber. It has also put significant money into a long list of start-ups that includes Wag, DoorDash, WeWork, Plenty, Cruise, Katerra, Nvidia and Slack. As the world fills up car tanks with gas and climate change worsens, Saudi Arabia reaps enormous profits — and some of that money shows up in the bank accounts of fast-growing companies that love to talk about “making the world a better place.”

Want more? You can read the full article here

Tech Survivor

It is almost a cliché to say that our lives have been unalterably changed by the five “FAANG” companies. Most know that FAANG is an acronym for the five most popular and best-performing tech stocks in the market: Facebook, Apple, Amazon, Netflix, and Alphabet’s Google.

Of these five companies, it seems that Apple is the one that intrigues us the most. That is likely due to the company’s history and also to the charisma of Steve Jobs.

That’s why I was drawn into an article, “How Apple Thrived in a Season of Tech Scandals.” Here’s how Farhad Manjoo begins:

The business world has long been plagued by Apple catastrophists — investors, analysts, rival executives and journalists who look at the world’s most valuable company and proclaim it to be imminently doomed.

The critics’ worry for Apple is understandable, even if their repeated wrongness is a little hilarious. Apple’s two-decade ascent from a near-bankrupt has-been of the personal computer era into the first trillion-dollar corporation has defied every apparent rule in tech.

Companies that make high-priced hardware products aren’t supposed to be as popular, as profitable or as permanent. To a lot of people in tech, Apple’s success can seem like a fluke, and every new hurdle the company has faced — the rise of Android, the death of Steve Jobs, the saturation of the smartphone market, the ascendance of artificial intelligence and cloud software — has looked certain to do it in.

But this year, as it begins to roll out a new set of iPhones, the storyline surrounding Apple has improbably shifted. In an era of growing skepticism about the tech industry’s impact on society, Apple’s business model is turning out to be its most lasting advantage.

Because Apple makes money by selling phones rather than advertising, it has been able to hold itself up as a guardian against a variety of digital plagues: a defender of your privacy, an agitator against misinformation and propaganda, and even a plausible warrior against tech addiction, a problem enabled by the very irresistibility of its own devices.

Though it is already more profitable than any of its rivals, Apple appears likely to emerge even stronger from tech’s season of crisis. In the long run, its growing strength could profoundly alter the industry.

Want more? You can read the full article here

Social Media

Much ink has been spilled regarding how much social media impacts our lives – much of it shrill. That’s why I was taken by a recent piece, “Tweeting Into the Abyss,” which reviews Jaron Lanier’s book “Ten Arguments for Deleting Your Social Media Accounts Right Now.” If that doesn’t get your attention, what will? Here’s how it begins:

My self-justifications were feeble. They could be described as hypocritical even. I had written a book denouncing Facebook, yet maintained an account on Mark Zuckerberg’s manipulation machine. Despite my comprehensive awareness of the perils, I would occasionally indulge in the voyeurism of the News Feed, succumb to zombie scrolling and would take the hit of dopamine that Sean Parker, Facebook’s founding president, has admitted is baked into the product. In internal monologues, I explained my behavior as a professional necessity. How could I describe the perniciousness of the platform if I never used it?

Critics of the big technology companies have refrained from hectoring users to quit social media. It’s far more comfortable to slam a corporate leviathan than it is to shame your aunt or high school pals — or, for that matter, to jettison your own long list of “friends.” As our informational ecosystem has been rubbished, we have placed very little onus on the more than two billion users of Facebook and Twitter. So I’m grateful to Jaron Lanier for redistributing blame on the lumpen-user, for pressing the public to flee social media. He writes, “If you’re not part of the solution, there will be no solution.”

Want more? You can read the full article here