More Than the Internet


Legions of companies, CEOs, tech workers and even laypeople are looking for “the next big thing” in technology. We spend time and considerable money at retreats, seminars and the like trying to find that new, new thing. For some it is an obsession, for others an avocation, and for most of us at least a curiosity. As for me, I try to stay plugged into what Adam Bosworth is doing.

If you have sent email through Gmail or used Microsoft’s browser or databases, you have touched the technology handiwork of Adam Bosworth. He has a penchant for being intimately involved in the creation of generations of widely used technology.

While it is never easy to predict what the next big thing will be, identifying what Mr. Bosworth is working on is always good for clues. Right now, along with competitors at companies like Amazon and Google, he is building what some call a “data singularity.”

Imagine if almost everything — streets, car bumpers, doors, hydroelectric dams — had a tiny sensor. That is already happening through so-called Internet-of-Things projects run by big companies like General Electric and IBM.

Think of it as one enormous process in which machines gather information, learn and change based on what they learn – all in seconds.
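That gather-learn-act loop can be sketched in a few lines. The sensor readings, the alert rule and the adaptive threshold below are all invented for illustration – a toy model, not any actual IoT platform’s logic:

```python
def update_threshold(threshold, reading, rate=0.1):
    """Nudge the alert threshold toward the latest reading --
    a stand-in for 'learning' from what the sensor saw."""
    return threshold + rate * (reading - threshold)

def process(readings, threshold=50.0):
    """Gather readings, act on anomalies, and adapt as we go."""
    alerts = []
    for r in readings:
        if r > threshold:
            alerts.append(r)  # act on what was sensed
        threshold = update_threshold(threshold, r)  # learn from it
    return alerts, threshold
```

Feed it a quiet reading, a spike, and another quiet reading, and it flags only the spike while quietly recalibrating what “normal” means.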

Read more about the next big thing(s) here:

Moore’s Law – Onward!


I’ve used this blog to talk about technology and to see where technology is pushing the edges of the envelope. Technology continues to dazzle us. But as Arthur C. Clarke famously said, “Any sufficiently advanced technology is indistinguishable from magic.”

A giant in the history of the computer – indeed of the world we know today – died this month. His name was Gordon Moore.

At the International Solid-State Circuits Conference, held on the campus of the University of Pennsylvania in Philadelphia in 1960, a young computer engineer named Douglas Engelbart introduced the electronics industry to the remarkably simple but groundbreaking concept of “scaling.”

Dr. Engelbart, who would later help develop the computer mouse and other personal computing technologies, theorized that as electronic circuits were made smaller, their components would get faster, require less power and become cheaper to produce — all at an accelerating pace.

Sitting in the audience that day was Gordon Moore, who went on to help found the Intel Corporation, the world’s largest chip maker. In 1965, Dr. Moore quantified the scaling principle and laid out what would have the impact of a computer-age Magna Carta. He predicted that the number of transistors that could be etched on a chip would double annually for at least a decade, leading to astronomical increases in computer power.

His prediction appeared in Electronics magazine in April 1965 – more than a half-century ago – and later came to be known as Moore’s Law. It was never a law of physics, but rather an observation about the economics of a young industry that ended up holding true for five decades.
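The compounding arithmetic behind that prediction is easy to sketch. The 2,300-transistor starting point below (roughly Intel’s first microprocessor, the 4004 of 1971) and the two-year doubling period are illustrative assumptions, not figures from the article:

```python
def moores_law(base_count, years, doubling_period_years=2):
    """Project transistor count under idealized periodic doubling."""
    return base_count * 2 ** (years // doubling_period_years)

# Starting from ~2,300 transistors and doubling every two years,
# forty years of scaling multiplies the count by 2**20 -- about a
# million-fold, which is why the growth is called astronomical.
projected = moores_law(2300, 40)
```

Swap in an annual doubling period – Moore’s original 1965 formulation – and ten years alone buys a thousand-fold increase.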

Read more in this thoughtful NYT piece here

 

Are We the Masters or Servants?


I’ve used this blog to talk about technology and to see where technology is pushing the edges of the envelope and not just delivering a handier Twitter app. Technology continues to dazzle us. But as Arthur C. Clarke famously said, “Any sufficiently advanced technology is indistinguishable from magic.”

But sometimes this “magic” has made many people wonder and filled others with fear. Few people can forget the chilling scene from Stanley Kubrick’s 2001: A Space Odyssey: Astronauts David Bowman and Frank Poole consider disconnecting HAL’s (Heuristically programmed ALgorithmic computer) cognitive circuits when he appears to be mistaken in reporting the presence of a fault in the spacecraft’s communications antenna. They attempt to conceal what they are saying, but are unaware that HAL can read their lips. Faced with the prospect of disconnection, HAL decides to kill the astronauts in order to protect and continue its programmed directives.

Are we still the masters of technology – or are we the servants? Movies like Ex Machina can make us wonder. Unfortunately, there is often more heat than light on the subject.

 

Read more in this thoughtful NYT Magazine piece here:

http://www.nytimes.com/2013/10/13/magazine/all-is-fair-in-love-and-twitter.html?_r=0

High Tech Nirvana?


Like many of you, I’m a big believer in technology, especially high-tech that springs from the big brains in Silicon Valley and elsewhere. But here’s the question: Will the best brains of the future build things resembling past innovations like cars and electricity or will they spend all their time making Twitter more user-friendly?

It’s worth asking: Are the strides we are seeing in high technology today really going to change our lives that profoundly and usher in the same kind of life-altering changes past technology revolutions have? Many think they will. But without being a technophobe, Robert Gordon takes a different view, and his arguments are compelling.

His new book, The Rise and Fall of American Growth, takes a thoughtful look at previous revolutions and, without dismissing today’s tech revolution, examines how truly life-changing those earlier ones were. Here is part of what noted economist Paul Krugman says in his review of Gordon’s 762-page book:

I was fascinated by Gordon’s account of the changes wrought by his Great Inventions. As he says, “Except in the rural South, daily life for every American changed beyond recognition between 1870 and 1940.” Electric lights replaced candles and whale oil, flush toilets replaced outhouses, cars and electric trains replaced horses. (In the 1880s, parts of New York’s financial district were seven feet deep in manure.)

Meanwhile, backbreaking toil both in the workplace and in the home was for the most part replaced by far less onerous employment. This is a point all too often missed by economists, who tend to think only about how much purchasing power people have, not about what they have to do to get it, and Gordon does an important service by reminding us that the conditions under which men and women labor are as important as the amount they get paid.

Aside from its being an interesting story, however, why is it important to study this transformation? Mainly, Gordon suggests — although these are my words, not his — to provide a baseline. What happened between 1870 and 1940, he argues, and I would agree, is what real transformation looks like. Any claims about current progress need to be compared with that baseline to see how they measure up.

And it’s hard not to agree with him that nothing that has happened since is remotely comparable. Urban life in America on the eve of World War II was already recognizably modern; you or I could walk into a 1940s apartment, with its indoor plumbing, gas range, electric lights, refrigerator and telephone, and we’d find it basically functional. We’d be annoyed at the lack of television and Internet — but not horrified or disgusted.

By contrast, urban Americans from 1940 walking into 1870-style accommodations — which they could still do in the rural South — were indeed horrified and disgusted. Life fundamentally improved between 1870 and 1940 in a way it hasn’t since.

Something to think about as we pin our hopes for the future on today’s emerging technology.

You can read Krugman’s full review here:

http://www.nytimes.com/2016/01/31/books/review/the-powers-that-were.html?_r=0

Does Technology Change the World?


Does technology change the world? And is it the “lone genius” who gives us these gifts? These are huge questions for all of us. Here is what Jon Gertner shares in “Unforeseeable Consequences:”

At various points in “How We Got to Now,” Steven Johnson helps us see how innovation is almost never the result of a lone genius experiencing a sudden voilà! moment; it’s a complex process involving a dizzying number of inputs, individuals, setbacks and (sometimes) accidents. Also, it’s hardly the exclusive domain of private-sector entrepreneurs. Important ideas are often driven by academics, governments and philanthropists.

Above all, though, technological histories like this help us reckon with how much we miss by focusing too exclusively on economic, cultural and political history. Not that any one domain is superior to another — only that Johnson proves you can’t explain one without the others. He does seem to suggest that technological history may have an advantage in one regard: It not only helps readers better see where we’ve been, but urges us to think harder about where we’re going.
Read the entire article here

Drone Wars

One of the most innovative technologies used anywhere – and especially in our military – is unmanned or autonomous systems, sometimes called “drones.” These military drones have been discussed a great deal in the media, especially armed drones, which are often operated by U.S. intelligence agencies or the military to take out suspected terrorists.

Few security issues are more controversial. In an effort to shed some light in an area where there is mostly heat, I published an article with Faircount Media entitled “The Other Side of Autonomy.”

A few salient quotes from this article capture the controversy surrounding “drone wars.”

In an article entitled, “Morals and the Machine,” The Economist addressed the issue of autonomy and humans-in-the-loop this way:

As they become smarter and more widespread, autonomous machines are bound to end up making life-or-death decisions in unpredictable situations, thus assuming—or at least appearing to assume—moral agency. Weapons systems currently have human operators “in the loop”, but as they grow more sophisticated, it will be possible to shift to “on the loop” operation, with machines carrying out orders autonomously. As that happens, they will be presented with ethical dilemmas…More collaboration is required between engineers, ethicists, lawyers and policymakers, all of whom would draw up very different types of rules if they were left to their own devices.

Bill Keller put the issue of autonomy for unmanned systems this way in his Op-ed, “Smart Drones,” in the New York Times in March 2013:

If you find the use of remotely piloted warrior drones troubling, imagine that the decision to kill a suspected enemy is not made by an operator in a distant control room, but by the machine itself. Imagine that an aerial robot studies the landscape below, recognizes hostile activity, calculates that there is minimal risk of collateral damage, and then, with no human in the loop, pulls the trigger. Welcome to the future of warfare. While Americans are debating the president’s power to order assassination by drone, powerful momentum – scientific, military and commercial – is propelling us toward the day when we cede the same lethal authority to software.

Looking ahead to this year and beyond, it is clear that “drone warfare” will continue to be extremely controversial. Stay tuned!

Click here to read the entire article (PDF)

Data Drones


Where does data fit in your life? Do you use it? Do you ignore it? Does it dominate your life? What about “big data?”

Big data is suddenly everywhere. Everyone seems to be collecting it, analyzing it, making money from it and celebrating (or fearing) its powers. Whether we’re talking about analyzing zillions of Google search queries to predict flu outbreaks, or zillions of phone records to detect signs of terrorist activity, or zillions of airline stats to find the best time to buy plane tickets, big data is on the case. By combining the power of modern computing with the plentiful data of the digital era, it promises to solve virtually any problem — crime, public health, the evolution of grammar, the perils of dating — just by crunching the numbers.

Or so its champions allege. “In the next two decades,” the journalist Patrick Tucker writes in the latest big data manifesto, “The Naked Future,” “we will be able to predict huge areas of the future with far greater accuracy than ever before in human history, including events long thought to be beyond the realm of human inference.” Statistical correlations have never sounded so good.
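Those statistical correlations are, at bottom, simple arithmetic. A minimal sketch of Pearson’s correlation coefficient in plain Python – the numbers fed to it here are made up – shows how little machinery “crunching the numbers” actually requires; the hard part, as the critics note, is knowing whether a correlation means anything:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient: +1 = perfect positive
    correlation, -1 = perfect negative, 0 = no linear relationship."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Two series that move in lockstep score a perfect 1.0 – whether they are flu searches and flu cases, or two coincidences that happen to trend together.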

Read more here

 

Tech’s Future

A Vision of the Future From Those Likely to Invent It (NYTimes.com)

What far-off technology will be commonplace in a decade? Our tech-savvy – some would say tech-crazed – culture always seems to want to know what the next big thing will be.

From employment to leisure and transportation to education, tech is changing the world at a faster pace than ever before. Already, people wear computers on their faces, robots scurry through factories and battlefields, and driverless cars dot the highway that cuts through Silicon Valley. Almost two-thirds of Americans think technological change will lead to a better future, while about one-third think people’s lives will be worse as a result, according to a new survey from Pew Research Center. Regardless, expect more change. In a series of interviews, which have been condensed and edited, seven people who are driving this transformation provided a glimpse into the not-too-distant future.

Read more here

Who Rules the Tech Economy?


Where will technology flourish tomorrow? Michael Malone tells us where in his hard-hitting WSJ piece: “Why Silicon Valley Will Continue to Rule the Tech Economy.”

Silicon Valley, especially its San Francisco wing, is richer and more powerful than ever. Yet there are growing murmurs—underscored by plateauing new-jobs numbers and housing prices, street protests in San Francisco over the new ‘plutocrats,’ the lack of exciting new products and a decline of early-stage new investments—that Silicon Valley has finally peaked and begun the downhill slide to irrelevance.

Slide? Perhaps. The Valley has always been characterized by a four-year boom-bust cycle, and the electronics industry is overdue for such a downturn. Yet there is very good reason to believe that not only will the Valley return bigger and stronger than ever, but that it will further consolidate its position against all comers as the World’s High Tech Capital.

Read more here on why:

http://online.wsj.com/articles/michael-malone-why-silicon-valley-will-continue-to-rule-the-tech-economy-1408747795?KEYWORDS=Michael+S+Malone

The Wired World


The Internet as we know it has only been around for a generation. Pretty much everyone in the industrialized world takes the Web for granted by now, as it has become a ubiquitous component of business, government, and our social lives. Yet most of us probably don’t give much thought to the 77 undersea fiber optic cables spanning nearly a million kilometers that carry 99% of the world’s communications and data.

Fortunately, one website has done the service of providing not only an amazing visualization of this vital, yet vulnerable, infrastructure, but also tying its importance to naval history and military operations. Even the Air Force’s UAV operations rely on undersea cables to reduce the latency inherent in satellite networks. Built Visible’s Messages in the Deep page serves as a reminder that our daily lives remain inextricably linked to the oceans (for even more detail, see Submarine Cable Map). This overlooked aspect of sea power facilitates global commerce today much as surface shipping has done for centuries.
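The latency point is straightforward physics. Here is a back-of-the-envelope comparison; the geostationary altitude, the roughly 6,000 km transatlantic cable run, and the fiber refractive index of about 1.47 are all illustrative figures, not ones taken from the article:

```python
C_VACUUM = 299_792_458       # speed of light in vacuum, m/s
C_FIBER = C_VACUUM / 1.47    # light slows down inside glass fiber

def one_way_latency_s(distance_m, speed_m_s):
    """Propagation delay alone -- ignores routing and processing."""
    return distance_m / speed_m_s

# Geostationary satellite: the signal climbs ~35,786 km to the
# satellite and drops ~35,786 km back down, even at full light speed.
sat_latency = one_way_latency_s(2 * 35_786_000, C_VACUUM)

# Undersea fiber: an assumed ~6,000 km transatlantic run, at the
# slower in-fiber speed, is still far shorter than the space detour.
fiber_latency = one_way_latency_s(6_000_000, C_FIBER)
```

Under these assumptions the satellite hop costs roughly a quarter of a second each way, versus about 30 milliseconds through the cable – nearly an order of magnitude, which is exactly why remotely piloted aircraft operations lean on the undersea network.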