Cyber War

We are inundated by information about the cyber world – most of it negative: Russian hacking designed to influence the American election, WikiLeaks, NSA monitoring of phone conversations, scammers emptying bank accounts and the like. The information comes at us so fast and furious that for most of us, there’s more heat than light.

That’s why I found the article “The @-Bomb” so revealing. It sheds light on the extent of the threat the cyber world poses to nations, as well as to individuals.

I’ve pored over the article looking for snippets to share with you, but it’s a fool’s errand. Spend some time devouring this great piece. You won’t be disappointed.

Thought-provoking? You can read the full article here.

Disrupted?

In previous blog posts about technology, we’ve addressed the always-controversial question of whether technology is our master or our servant. As with many issues, the jury is still out.

In a recent piece in The New York Times Book Review, Leon Wieseltier addresses this issue head-on in his provocative essay, “Among the Disrupted.” Here is part of what he shared:

“Amid the bacchanal of disruption, let us pause to honor the disrupted. The streets of American cities are haunted by the ghosts of bookstores and record stores, which have been destroyed by the greatest thugs in the history of the culture industry. Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind. Everybody talks frantically about media, a second-order subject if ever there was one, as content disappears into “content.” What does the understanding of media contribute to the understanding of life? Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability. As the frequency of expression grows, the force of expression diminishes: Digital expectations of alacrity and terseness confer the highest prestige upon the twittering cacophony of one-liners and promotional announcements. It was always the case that all things must pass, but this is ridiculous.”

But most telling is what he said about the growing reverence for data – a reverence he sees morphing into near worship:

“Meanwhile the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms: Economists are our experts on happiness! Where wisdom once was, quantification will now be. Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology. The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past. Beyond its impact upon culture, the new technology penetrates even deeper levels of identity and experience, to cognition and to consciousness. Such transformations embolden certain high priests in the church of tech to espouse the doctrine of “transhumanism” and to suggest, without any recollection of the bankruptcy of utopia, without any consideration of the cost to human dignity, that our computational ability will carry us magnificently beyond our humanity and “allow us to transcend these limitations of our biological bodies and brains. . . . There will be no distinction, post-Singularity, between human and machine.” (The author of that updated mechanistic nonsense is a director of engineering at Google.)”

Interested in learning more? You can read the complete article here.

Online…Really?

Tom Friedman has a unique talent for spotting trends that impact our lives. So it was inevitable that he would write about the sea change in how we balance what we do in the “real” world with what we do in cyberspace. He even challenges the notion of what is real and what isn’t. While some pundits complain about this change, he embraces it. In his own words:

In 2016 we reached a tipping point. It was the moment when we realized that a critical mass of our lives and work had shifted away from the terrestrial world to a realm known as “cyberspace.” That is to say, a critical mass of our interactions had moved to a realm where we’re all connected but no one’s in charge.

After all, there are no stoplights in cyberspace, no police officers walking the beat, no courts, no judges, no God who smites evil and rewards good, and certainly no “1-800-Call-If-Putin-Hacks-Your-Election.” If someone slimes you on Twitter or Facebook, well, unless it is a death threat, good luck getting it removed, especially if it is done anonymously, which in cyberspace is quite common.

And yet this realm is where we now spend increasing hours of our day. Cyberspace is now where we do more of our shopping, more of our dating, more of our friendship-making and sustaining, more of our learning, more of our commerce, more of our teaching, more of our communicating, more of our news-broadcasting and news-seeking and more of our selling of goods, services and ideas.

Interested in this tipping point? You can read the complete article here.

Real Things

We live in a digital world. But analog is making a comeback. Some of us sense it; some of us feel it. As Michiko Kakutani explains in her review of David Sax’s book, The Revenge of Analog, there are compelling reasons for this. She begins her review by quoting Stephen King: “Sooner or later, everything old is new again.”

Here’s part of what’s in this killer-good review:

“In his captivating new book, “The Revenge of Analog,” the reporter David Sax provides an insightful and entertaining account of this phenomenon, creating a powerful counternarrative to the techno-utopian belief that we would live in an ever-improving, all-digital world. Mr. Sax argues that analog isn’t going anywhere, but is experiencing a bracing revival that is not just a case of nostalgia or hipster street cred, but something more complex.”

“Analog experiences can provide us with the kind of real-world pleasures and rewards digital ones cannot,” he writes, and “sometimes analog simply outperforms digital as the best solution.” Pen and paper can give writers and designers a direct means of sketching out their ideas without the complicating biases of software, while whiteboards can bring engineers “out from behind their screens” and entice them “to take risks and share ideas with others.”

You can read the complete review here.

Our Online World

Who are you? Are there two yous? Increasingly, we have two personas: one we exhibit in person and an entirely different one we adopt online. This is something new in our society, brought on by the internet. Our personas and our behavior do change when we go online. A groundbreaking book helps us understand why.

Jon Ronson titles his review of Mary Aiken’s book, “The Cyber Effect,” “Offline Jekylls, Online Hydes,” and it is apt, because while in person we typically adhere to what we feel “polite society” demands, online we are often different people, sometimes people who are unrecognizable to our family and close friends. Here is part of what Ronson offers:

“This is her provocative and at times compelling thesis: The internet — “the largest unregulated social experiment of all time,” in the words of the clinical psychologist Michael Seto — is turning us, as a species, more mentally disordered, anxious, obsessive, narcissistic, exhibitionist, body dysmorphic, psychopathic, schizophrenic. All this might unleash a “surge in deviant, criminal and abnormal behavior in the general population.” We check our mobile devices 1,500 times a week, sometimes even secretly, before the plane’s pilot tells us it’s safe. Our ethics have become so impaired that some of us take selfies in front of people threatening to jump from bridges. (Having spent years with people disproportionately shamed on social media for some minor transgression, I can attest to how the internet can rob people of empathy.)”

“She paints an evocative image of sitting on a train to Galway, watching a woman breast-feed her baby: “The baby was gazing foggily upward . . . looking adoringly at the mother’s jaw, as the mother continued to gaze adoringly at her device.” How will such a seemingly tiny behavioral shift like less eye contact between mother and baby play out over time? Aiken asks. “This small and simple thing, millions of babies around the world getting less eye contact and less one-on-one attention, could result in an evolutionary blip. Yes, I said it. Evolutionary blip. Less eye contact could change the course of human civilization.””

Thought-provoking? You can read the full article here.

Mr. Moore’s “Law”

Who could have predicted where technology – and especially the high technology that makes it possible for us to have cell phones and tablets and to surf the internet – would take us today? Most of us would have lost our shirts had we placed wagers on technology’s future trajectory – not because we had wild dreams, but because our view was so limited.

One pioneer did have that vision – over a half-century ago. In 1965, Gordon Moore gave us what we refer to today as Moore’s Law. Here’s part of what Michael Malone had to say in his Wall Street Journal article:

Fifty years ago, on April 19, 1965, chemist and reluctant entrepreneur Gordon E. Moore set out to graph the rapid rate of improvement in semiconductor-chip performance—and ended up discovering the heartbeat of the modern world.

That discovery is what became known as “Moore’s Law,” which is the observation that performance (speed, price, size) of integrated circuits, aka microchips, regularly doubled every 18 months. The graph began as an illustration to an article in Electronics magazine, and it didn’t acquire the name “Moore’s Law” for another decade. And for a decade after that it remained a topic of interest mostly inside the semiconductor industry.

Moore’s Law became iconic not because of its novelty or through media promotion, but because it has proved to be the most effective predictive tool of new chip generations, technology innovation and even social and cultural changes of the last half-century. It achieved all this despite doubts about its durability even by Mr. Moore and the fact that it isn’t really a scientific law.

Yet against all odds and regular predictions of its imminent demise, Moore’s Law endures. The historic doubling of chip performance every 18 months may have slowed to half that pace today, but even pessimists now accept that we likely will live under its regime well into the next decade and beyond.
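
To make the arithmetic concrete, here is a minimal sketch in Python – my own illustration with round numbers, not figures from Malone’s article – of how performance compounds over a decade at the classic 18-month pace versus the slowed pace:

    # Relative improvement under Moore's Law after a given span of time.
    def moores_law_factor(months_elapsed, doubling_period_months):
        return 2 ** (months_elapsed / doubling_period_months)

    decade = 10 * 12  # 120 months
    print(round(moores_law_factor(decade, 18), 1))  # ~101.6x at the classic pace
    print(round(moores_law_factor(decade, 36), 1))  # ~10.1x at half that pace

Even at the slower pace the growth is still exponential, which is why the “law” keeps shaping expectations.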

Read more about how this “law” has changed our lives here.

Digital World

Do we control our digital world, or does someone else? It seems that we have choices, but do we really?

I recently read I Hate the Internet. Whew. It really made me think about the subject of technology and our lives. And it’s a NOVEL.

If you don’t have time to read the entire book, I’ve posted a link to a New York Times review below.

“In his new novel, “I Hate the Internet,” Jarett Kobek performs a similar maneuver on the viscera of the American psyche, at least as regards the so-called information highway. I can’t decide if, on his way down, Mr. Kobek is laughing or weeping.”

Here is a key thought: “One of the curious aspects of the 21st century was the great delusion amongst many people, particularly in the San Francisco Bay Area, that freedom of speech and freedom of expression were best exercised on technology platforms owned by corporations dedicated to making as much money as possible.”

Read more here.

More Than the Internet

Legions of companies, CEOs, tech workers and even laypeople are looking for “the next big thing” in technology. We spend time and considerable money at retreats, seminars and the like trying to find that new, new thing. For some, it is an obsession; for others, an avocation; and for most of us, at least a curiosity. As for me, I try to stay plugged into what Adam Bosworth is doing.

If you have sent email on Google or used Microsoft’s browser or databases, you have touched the technology handiwork of Adam Bosworth. He has a penchant for being intimately involved in the creation of generations of widely used technology.

While it is never easy to predict what the next big thing will be, identifying what Mr. Bosworth is working on is always good for clues. Right now, along with competitors at companies like Amazon and Google, he is building what some call a “data singularity.”

Imagine if almost everything — streets, car bumpers, doors, hydroelectric dams — had a tiny sensor. That is already happening through so-called Internet-of-Things projects run by big companies like General Electric and IBM.

Think of it as one enormous process in which machines gather information, learn and change based on what they learn – all in seconds.
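
To picture that loop, here is a toy sketch in Python – every name and number below is invented for illustration and has nothing to do with Bosworth’s actual systems:

    import random

    def read_sensor():
        # Stand-in for a real device; returns a fake temperature reading.
        return 20.0 + random.gauss(0, 2)

    readings = []
    threshold = 25.0  # initial alert level; re-learned as data arrives

    for _ in range(100):  # a real pipeline would run continuously
        value = read_sensor()
        readings.append(value)
        # "Learn": re-estimate the alert level from the data seen so far.
        mean = sum(readings) / len(readings)
        threshold = mean + 3.0  # simplistic update rule, purely illustrative
        if value > threshold:
            print("unusual reading:", round(value, 1))

The real systems differ in scale – millions of sensors and statistical models rather than a crude running mean – but the gather-learn-adjust cycle is the core idea.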

Read more about the next big thing(s) here.

Moore’s Law – Onward!

I’ve used this blog to talk about technology and to see where technology is pushing the edge of the envelope. Technology continues to dazzle us. But as Arthur C. Clarke famously said, “Any sufficiently advanced technology is indistinguishable from magic.”

A giant in the history of the computer – indeed of the world we know today – died this month. His name was Gordon Moore.

At the inaugural International Solid-State Circuits Conference held on the campus of the University of Pennsylvania in Philadelphia in 1960, a young computer engineer named Douglas Engelbart introduced the electronics industry to the remarkably simple but groundbreaking concept of “scaling.”

Dr. Engelbart, who would later help develop the computer mouse and other personal computing technologies, theorized that as electronic circuits were made smaller, their components would get faster, require less power and become cheaper to produce — all at an accelerating pace.

Sitting in the audience that day was Gordon Moore, who went on to help found the Intel Corporation, the world’s largest chip maker. In 1965, Dr. Moore quantified the scaling principle and laid out what would have the impact of a computer-age Magna Carta. He predicted that the number of transistors that could be etched on a chip would double annually for at least a decade, leading to astronomical increases in computer power.

His prediction appeared in Electronics Magazine in April 1965 – over a half-century ago – and was later called Moore’s Law. It was never a law of physics, but rather an observation about the economics of a young industry that ended up holding true for a half-century.
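
It is worth pausing on what “double annually for at least a decade” compounds to – a quick back-of-the-envelope check (my own arithmetic, not from the article):

    # Ten annual doublings multiply transistor counts by 2**10.
    print(2 ** 10)  # 1024 - roughly a thousandfold increase in a decade

A thousandfold increase in ten years is why “astronomical” is no exaggeration.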

Read more in this thoughtful NYT piece here.

Are We the Masters or Servants?

I’ve used this blog to talk about technology and to see where technology is pushing the edge of the envelope – not just delivering a handier Twitter app. Technology continues to dazzle us. But as Arthur C. Clarke famously said, “Any sufficiently advanced technology is indistinguishable from magic.”

But sometimes this “magic” has made many people wonder and filled others with fear. Few can forget the chilling scene from Stanley Kubrick’s 2001: A Space Odyssey: astronauts David Bowman and Frank Poole consider disconnecting the cognitive circuits of HAL (the Heuristically programmed ALgorithmic computer) when it appears to be mistaken in reporting a fault in the spacecraft’s communications antenna. They attempt to conceal what they are saying, but are unaware that HAL can read their lips. Faced with the prospect of disconnection, HAL decides to kill the astronauts in order to protect and continue its programmed directives.

Are we still the masters of technology – or are we its servants? Movies like Ex Machina can make us wonder. Unfortunately, there is often more heat than light on the subject.

Read more in this thoughtful NYT Magazine piece here:

http://www.nytimes.com/2013/10/13/magazine/all-is-fair-in-love-and-twitter.html?_r=0