Category Archives: Technology

The Interconnected World and Martin Luther?

Some thoughts about how things change, and yet how much stays the same…
By Niall Ferguson
Originally published in The Sunday Times, October 1, 2017

Just as Martin Luther’s utopian vision and the invention of the printing press led to an era of religious war and turmoil, the internet, hailed as a portal to a better world, is threatening democracy

The hyperconnected world was not supposed to be like this. In May, Evan Williams, one of the founders of Twitter, told The New York Times: “I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place. I was wrong about that.”

In September Sheryl Sandberg, Facebook’s chief operating officer, acknowledged that the company’s online tools had allowed advertisers to target self-described “Jew haters”. “We never intended or anticipated this functionality being used this way,” she admitted, “and that is on us.”

Surprise! The men and women who built the internet-based social networks that have so transformed our lives thought everything would be awesome if only we could all be connected. Speaking at Harvard’s degree ceremony in May, Facebook’s co-founder and chief executive, Mark Zuckerberg, looked back on his undergraduate ambition to “connect the whole world”. “This idea was so clear to us,” he recalled, “that all people want to connect . . . My hope was never to build a company, but to make an impact.”

Facebook certainly made an impact last year, but not quite the impact the young Zuckerberg had in mind in his Harvard dorm. A committed believer in globalisation who tends to wear his liberal politics on his T-shirt sleeve, Zuckerberg is reeling. Not only did the masterminds behind the Brexit and Trump campaigns successfully use Facebook advertising to hone and target their ultimately victorious campaign messages; worse, the Russian government appears to have used Facebook in the same way, seeking to depress voter support for Hillary Clinton. Worse still, neo-Nazis seem to have been using the social network to spread their own distinctive brand of hate.

Yet the architects of the biggest social networks to have existed should not have been surprised. If he had studied history at Harvard rather than psychology and computer science, Zuckerberg might have foreseen the ways in which Facebook and its ilk would be used and abused.

Five hundred years ago this year, Martin Luther sent his critique of corrupt church practices as a letter to the Archbishop of Mainz. It is not wholly clear if Luther also nailed a copy to the door of All Saints’ Church, Wittenberg, but it scarcely matters. Thanks to the invention of the printing press by Johannes Gutenberg, that mode of publishing had been superseded.

Before 1517 was out, versions of Luther’s original Latin text had been printed in Basel, Leipzig and Nuremberg. By the time Luther was officially condemned as a heretic by the Edict of Worms in 1521, his writings were all over German-speaking Europe. In the course of the 16th century, German printers produced almost 5,000 editions of Luther’s works.

Luther’s vision was utopian. Just as Zuckerberg today dreams of creating a single “global community”, so Luther believed that his Reformation would produce a “priesthood of all believers”, all reading the Bible, all in a direct relationship to the one, true God.

It didn’t turn out that way. The Reformation unleashed a wave of religious revolt against the authority of the Roman Catholic Church. As it spread from reform-minded clergymen and scholars to urban elites to illiterate peasants, it threw first Germany and then all of northwestern Europe into turmoil.

In 1524 a full-blown peasants’ revolt broke out. By 1531 there were enough Protestant princes to form an alliance (the Schmalkaldic League) against the Holy Roman Emperor, Charles V. Although defeated, the Protestants were powerful enough to preserve the Reformation in a patchwork of territories.

Religious conflict erupted again in the Thirty Years’ War, a conflict that turned central Europe into a charnel house. Especially in northwestern Europe – in England, Scotland and the Dutch Republic – it proved impossible to re-establish Roman Catholicism, even when Rome turned the technologies and networking strategy of the Reformation against it, in addition to the more traditional array of cruel tortures and punishments that had long been the church’s forte.

The global impact of the internet has few analogues in history better than the impact of printing on 16th-century Europe. The personal computer and smartphone have empowered networks as much as the pamphlet and the book did in Luther’s time.

Indeed, the trajectories for the production and price of PCs in America between 1977 and 2004 are remarkably similar to the trajectories for the production and price of printed books in England from 1490 to 1630.

In the era of the Reformation and thereafter, connectivity was enhanced exponentially by rising literacy, so that a growing share of the population was able to access printed literature of all kinds, rather than having to rely on orators and preachers to convey new ideas to them.

There are three major differences between our networked age and the era that followed the advent of European printing. First, and most obviously, our networking revolution is much faster and more geographically extensive than the wave of revolutions unleashed by the German printing press.

In a far shorter space of time than it took for 84% of the world’s adults to become literate, a remarkably large proportion of humanity has gained access to the internet. As recently as 1998 only about 2% of the world’s population were online. Today the proportion is two in five. The pace of change is roughly an order of magnitude faster than in the post-Gutenberg period: what took centuries after 1490 took just decades after 1990.

[This is just unbelievable.] Google started life in a garage in Menlo Park, California, in 1998. Today it has the capacity to process more than 4.2bn search requests every day. In 2005 YouTube was a start-up in a room above a pizzeria in San Mateo. Today it allows people to watch 8.8bn videos a day. Facebook was dreamt up at Harvard just over a decade ago. Today it has more than 2bn users who log on at least once a month.

The scale of Facebook’s success is especially staggering. Two-thirds of American adults are Facebook users. Just under half get their news from Facebook.

It used to be said that there were six degrees of separation between any two individuals on the planet – say, between yourself and Monica Lewinsky. On Facebook there are just 3.57 degrees of separation, meaning that any two of the 2bn Facebook users can get in touch by taking fewer than four steps through the network. The world is indeed connected as never before. We are all friends of friends of friends of friends.

Second, the distributional consequences of our revolution are quite different from those of the early-modern revolution. Early modern Europe was not an ideal place to enforce intellectual property rights, which in those days existed only when technologies could be secretively monopolised by a guild. The printing press created no billionaires.

Johannes Gutenberg was not Bill Gates (indeed, by 1456 he was effectively bankrupt). Moreover, only a subset of the media made possible by the printing press – the newspapers and magazines invented in the 18th century – sought to make money from advertising, whereas all the most important ones made possible by the internet do. Few people foresaw that these giant networks would be so profoundly inegalitarian.

To be sure, innovation has driven down the costs of information technology. Globally, the costs of computing and digital storage fell at annual rates of, respectively, 33% and 38% between 1992 and 2012. Everyone has benefited from that. However, oligopolies have developed in the realms of both hardware and software, as well as service provision and wireless networks.
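The compounding implied by those annual rates is dramatic. A back-of-envelope sketch (my own illustration, assuming the cited 33% and 38% rates compounded steadily across the full 1992–2012 span) makes it concrete:

```python
def remaining_cost(annual_decline: float, years: int) -> float:
    """Fraction of the original cost left after `years` of steady decline."""
    return (1 - annual_decline) ** years

# 33%/year for computing, 38%/year for storage, over the 20 years 1992-2012.
computing = remaining_cost(0.33, 20)
storage = remaining_cost(0.38, 20)
print(f"computing cost fell to {computing:.2%} of its 1992 level")  # ~0.03%
print(f"storage cost fell to {storage:.3%} of its 1992 level")      # ~0.007%
```

Twenty years at those rates leaves roughly one three-thousandth of the original computing cost, which is why "everyone has benefited" is not hyperbole.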

The ownership of the world’s electronic network is extraordinarily concentrated. Google (or rather the renamed parent company, Alphabet Inc) is worth $669bn by market capitalisation. About 16% of its shares, worth around $106bn, are owned by its founders, Larry Page and Sergey Brin. The market capitalisation of Facebook is approaching $500bn; 475 million of the shares, worth about $81bn, are owned by its T-shirt-loving founder.

Unlike in the past, there are now two kinds of people in the world: those who own and run the networks, and those who merely use them.

Third, the printing press had the effect of disrupting religious life in western Christendom before it disrupted anything else. By contrast, the internet began by disrupting commerce; only very recently did it begin to disrupt politics, and it has really disrupted only one religion, namely Islam.

The political disruption reached a climax last year, when social networks helped to topple David Cameron in the Brexit referendum and to defeat Hillary Clinton in the US presidential election.

In the American case, a number of networks were operating. There was the grassroots network of support that the Trump campaign built – and that built itself – on the platforms of Facebook and Twitter. These were the “forgotten” men and women who turned out on November 8 to defeat the “failed and corrupt political establishment” that Trump’s opponent was said to personify.

A role was also played by the jihadist network, as the Isis-affiliated terror attacks during the election year lent credibility to Trump’s pledges to “strip out the support networks for radical Islam” and to ban Muslim immigration.

Yet in two respects there is a clear similarity between our time and the revolutionary period that followed the advent of printing. Like the printing press, modern information technology is transforming not only the market – most recently, by facilitating the sharing of cars and homes – but also the public sphere. Never before have so many people been connected in an instantly responsive network through which “memes” can spread even more rapidly than natural viruses.

But the notion that taking the whole world online would create a utopia of netizens, all equal in cyber-space, was always a fantasy – as much a delusion as Luther’s vision of a “priesthood of all believers”. The reality is that the global network has become a transmission mechanism for all kinds of manias and panics, just as the combination of printing and literacy for a time increased the prevalence of millenarian sects and witch crazes. The cruelties of Isis seem less idiosyncratic when compared with those of some governments and sects in the 16th and 17th centuries.

Second, our time is seeing an erosion of territorial sovereignty. In the 16th and 17th centuries, Europe was plunged into a series of religious wars. Spain and France tried by fair means and foul to bring England back to the Roman Catholic fold. As late as 1745, a French-backed army of Scottish Highlanders invaded England with a view to restoring the old faith in the British Isles.

In the 21st century, we see a similar phenomenon of escalating intervention in the domestic affairs of sovereign states. There was, after all, a further network involved in the US election of 2016, and that was Russia’s intelligence network.

It is clear that the Russian government did its utmost to maximise the damage to Clinton’s reputation stemming from her and her campaign’s sloppy email security, using WikiLeaks as the conduit through which stolen documents were passed to the western media. Russian hackers and trolls last year posed a threat to American democracy similar to the one that Jesuit priests posed to the English Reformation: a threat from within sponsored from without.

Leave aside the question of whether or not the Russian interference decided the election in favour of Trump; suffice to say it helped him, though both fake and real news damaging to Clinton was also disseminated without Russia’s involvement. Leave aside, too, the as yet unresolved questions of how many members of the Trump campaign were complicit in the Russian operation, and how much they knew.

The critical point is that Facebook itself may have decided the outcome of an election that would have gone the other way if about 40,000 voters in just three states had chosen Clinton over Trump.

No, it wasn’t meant to be this way. This was not what Silicon Valley envisaged when it set out to build “a planet where everything is connected” – the motto of Eric Schmidt’s foundation.

But then Luther didn’t set out to cause 130 years of bloody religious warfare either.

© Niall Ferguson 2017

Extracted from The Square and the Tower: Networks, Hierarchies and the Struggle for Global Power by Niall Ferguson


The Computer that could Rule the World

This makes me nervous.
WSJ – 10/28/2017
By Arthur Herman

During World War II the federal government launched the Manhattan Project to ensure the U.S. would possess the first atomic bomb. Seventy-five years later, America is in another contest just as vital to national security, the economy and even the future of liberal democracy. It’s the race to build the first fully operational quantum computer.

America’s leading adversaries are working urgently to develop such a computer, which uses the principles of quantum mechanics to operate on data exponentially faster than traditional computers. Such a system theoretically would have enough computing power to open the encrypted secrets of every country, company and person on the planet. It would also enable a foreign creator to end America’s dominance of the information-technology industry and the global financial system.

How does quantum computing work? In the bizarre world of quantum mechanics, electrons and photons can be in two states at once. All current computers process data in linear sequences of ones and zeros. Every bit, the smallest unit of data, has to be either a zero or a one. But a quantum bit, or “qubit,” can be a zero and a one at the same time, and do two computations at once. Add more qubits, and the computing power grows exponentially. This will allow quantum computers of the future to solve problems thousands of times as fast as today’s fastest supercomputer.
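The counting argument behind that exponential growth can be shown in a few lines of plain Python (this is classical code illustrating the bookkeeping, not quantum code): an n-qubit register is described by 2^n complex amplitudes, one for each basis state it can occupy simultaneously.

```python
# Classical illustration of qubit scaling: simulating an n-qubit
# register requires tracking 2**n amplitudes, one per basis state.
def amplitudes_needed(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(f"{n:2d} qubits -> {amplitudes_needed(n):,} amplitudes")
```

At 50 qubits the register already spans more than a quadrillion amplitudes, which is roughly why that figure is treated as the threshold of “quantum supremacy.”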

This poses a problem for most encryption systems, because they are based on math problems that would take a conventional computer centuries to solve. The encryption that protects credit-card information and bank accounts, for instance, relies on two keys. One is the “private key,” which consists of two large prime numbers only known to the bank. The “public key” sits in cyberspace and is the product of multiplying together the two “private” primes to create a semiprime. The only way a hacker could access encrypted credit card or bank information would be by factorizing or breaking down the large “public key”—often 600 digits or longer—back to the correct two numbers of the “private key.” This Herculean task simply takes too long for current computers.
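The asymmetry the article leans on, easy to multiply but hard to factor, can be seen in miniature. This toy sketch uses primes far too small to be secure, purely for illustration; real public keys run to hundreds of digits.

```python
# Toy two-key setup: the "private key" is a pair of primes, the
# "public key" is their product (a semiprime), published openly.
p, q = 61, 53
public_key = p * q  # 3233

def factor(semiprime: int) -> tuple[int, int]:
    """Brute-force factoring: the task a classical attacker faces."""
    for candidate in range(2, int(semiprime ** 0.5) + 1):
        if semiprime % candidate == 0:
            return candidate, semiprime // candidate
    raise ValueError("no nontrivial factors found")

print(factor(public_key))  # instant at 4 digits; centuries at 600 digits
```

Multiplying p and q takes one step; recovering them from the semiprime takes a search that grows explosively with the key size, which is precisely the “Herculean task” a large quantum computer would short-circuit.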

A future quantum computer will be able to decrypt such systems almost instantaneously. Even blockchain will not be able to withstand the first quantum attack if it relies on the two-key encryption architecture that protects nearly all digital information today. To understand the scale of the threat, imagine a thousand Equifax breaches happening at once.

As a September article in the journal Nature noted: “Many commonly used cryptosystems will be completely broken once large quantum computers exist.” Most quantum experts believe that such a breakthrough may be only a decade away. If quantum computers hold the key to the global future, the U.S. needs to secure that key.

Scientists already know that quantum computing is possible. The problem now is engineering a system that takes full advantage of its potential. Since subatomic particles are inherently unstable, assembling enough qubits to do calculations takes persistence, time and resources. Quantum computers with 10 qubits already exist. A quantum computer capable of solving problems that would stump a classical computer is close at hand. Fifty qubits will mark the threshold of quantum supremacy.

Other countries understand that. While most of the work on quantum computing in the U.S. is being done by companies like Google and Microsoft, the European Union has made quantum research a flagship project over the next 10 years and is committed to investing nearly €1 billion in the effort. Australia, the U.K. and Russia have entered the quantum race, too.

But the real national leader in quantum research investment is China. This summer it launched the first satellite capable of transmitting quantum data. It’s building the world’s largest quantum research facility to develop a quantum computer specifically for code-breaking and supporting its armed forces, with quantum navigation systems for stealth submarines. Beijing is investing around $10 billion in the facility, which is to be finished in 2½ years.

Today the U.S. government spends only $200 million a year on quantum research of all kinds, spread haphazardly over a variety of agencies—from the National Security Agency to the Energy Department.

While IBM recently set a new benchmark with its 17-qubit processor, and Google insists it will reach the 50-qubit threshold before the end of this year, China is steadily advancing toward a 40-qubit prototype—and remains determined to reach “quantum supremacy.” At the same time, countries will need to revamp their encryption systems to keep up with the new quantum reality.

The U.S. can achieve both goals through a new Manhattan Project. Call it the National Quantum Initiative. Like its atomic predecessor, the new program should marshal federal government money, the efficiencies of private industry, and the intellectual capital of the nation’s laboratories and universities, while keeping everyone focused on the essential mission: winning the quantum race.

The Manhattan Project cost some $30 billion in today’s dollars. In comparison, the National Photonics Initiative has called for an additional $500 million of federal funding over five years to help the U.S. secure its grip on quantum supremacy.

Recognizing this, Congress held its first hearings on a national initiative for quantum computing on Oct. 24. Congressional leaders should now pass a bill funding a National Quantum Initiative.

Equally important is to make sure that America’s financial system, critical infrastructure and national-security agencies are fully quantum resistant. Companies and labs are currently developing algorithms and tamper-proof encryption based on quantum technology. But without a concerted and coherent national effort, it will take years for government and industry to agree on the standards for quantum-safe replacements for today’s encryption methods, and to make sure they are deployed in time to prevent a quantum attack. In a world of quantum proliferation, the risks are too great to ignore.

Since the end of World War II, the U.S. has led the world in nuclear research, making this country stronger and safer. For three decades the U.S. has been the leader in information technology, which has made Americans more innovative and prosperous. The U.S. cannot afford to lose that leadership now—not when the future hangs in the quantum balance.

Mr. Herman, a senior fellow at the Hudson Institute, is author of “1917: Lenin, Wilson, and the Birth of the New World Disorder,” forthcoming from HarperCollins in November.


Fake News and the Digital Duopoly

If you think Google, Facebook, or even LinkedIn are ‘charities’ primarily concerned with the task of “serving mankind,” I would say you are deluded.
By Robert Thomson    WSJ 4/5/2017

‘Fake news” has seemingly, suddenly, become fashionable. In reality, the fake has proliferated for a decade or more, but the faux, the flawed and the fraudulent are now pressing issues because the full scale of the changes wrought upon the integrity of news and advertising by the digital duopoly—Google and Facebook—has become far more obvious.

Google’s commodification of content knowingly, willfully undermined provenance for profit. That was followed by the Facebook stream, with its journalistic jetsam and fake flotsam. Together, the two most powerful news publishers in human history have created an ecosystem that is dysfunctional and socially destructive.

Both companies could have done far more to highlight that there is a hierarchy of content, but instead they have prospered mightily by peddling a flat-earth philosophy that doesn’t distinguish between the fake and the real because they make copious amounts of money from both.

Depending on which source you believe, they have close to two-thirds of the digital advertising market—and let me be clear that we compete with them for that share. The Interactive Advertising Bureau estimates they accounted for more than 90% of the incremental increase in digital advertising over the past year.

The only cost of content for these companies has been lucrative contracts for lobbyists and lawyers, but the social cost of that strategy is far more profound.

It is beyond risible that Google and its subsidiary YouTube, which have earned many billions of dollars from other people’s content, should now be lamenting that they can’t possibly be held responsible for monitoring that content. Monetizing yes, monitoring no—but it turns out that free money does come at a price.

We all have to work with these companies, and we are hoping, mostly against hope, that they will finally take meaningful action, not only to allow premium content models that fund premium journalism, but also to purge their sites of the rampant piracy that undermines creativity. Your business model can’t be simultaneously based on both intimate, granular details about users and no clue whatsoever about rather obvious pirate sites.

Another area that urgently needs much attention is the algorithms that Silicon Valley companies, and Amazon, routinely cite as a supposedly objective source of wisdom and insight. These algorithms are obviously set, tuned and repeatedly adjusted to suit their commercial needs. Yet they also blame autonomous, anarchic algorithms and not themselves when neofascist content surfaces or when a search leads to obviously biased results in favor of their own products.

Look at how Google games searches. A study reported in The Wall Street Journal found that in 25,000 random Google searches ads for Google products appeared in the most prominent slot 91% of the time. How is that not the unfair leveraging of search dominance and the abuse of algorithm? All 1,000 searches for “laptops” started with an ad for Google’s Chromebook—100% of the time. Kim Jong Un would be envious of results like that at election time.

And then there are the recently launched Google snippets, which stylistically highlight search results as if they were written on stone tablets and carried down from the mountain. Their sheer visual physicality gives them apparent moral force. The word Orwellian is flagrantly abused, but when it comes to the all-powerful algorithms of Google, Amazon and Facebook, Orwellian is underused.

As for news, institutional neglect has left us perched on the edge of the slippery slope of censorship. There is no Silicon Valley tradition, as there is at great newspapers, of each day arguing over rights and wrongs, of fretful, thoughtful agonizing over social responsibility and freedom of speech.

What we now have is a backlash with which these omnipotent companies are uniquely ill-equipped to cope. Their responses tend to be political and politically correct. Regardless of your own views, you should be concerned that we are entering an era in which these immensely influential publishers will routinely and selectively “unpublish” certain views and news.

We stumble into this egregious era at a moment when the political volume in many countries is turned to 10. The echo chamber has never been larger and the reverb room rarely more cacophonous. This is not an entirely new trend, but it has a compounding effect with the combination of “holier than thou” and “louder than thou.”

Curiously, this outcome is, in part, a result of the idealism of the Silicon Valley set, and there’s no doubt about the self-proclaimed ideals. They devoutly believe they are connecting people and informing them, which is true, even though some of the connections become conspiracies and much of the information is skimmed without concern for intellectual property rights.

Ideas aside, we were supposed to be in a magic age of metrics and data. Yet instead of perfect precision we have the cynical arbitraging of ambiguity—particularly in the world of audiences. Some advertising agencies are also clearly at fault because they, too, have been arbitraging and prospering from digital ambiguity as money in the ad business has shifted from actually making ads to aggregating digital audiences and ad tech, better known as fad tech.

And so, as the Times of London has reported, socially aware, image-conscious advertisers find themselves in extremely disreputable places—hardcore porn sites, neofascist sites, Islamist sites. The embarrassment for these advertisers juxtaposed with jaundice is understandable, but the situation is far more serious than mere loss of face.

If these sites are getting a cut of the commission, the advertisers are technically funding these nefarious activities. Depending on the type of advertising, it is estimated by the ad industry that a YouTube partner could earn about 55% of the revenue from a video. In recent years, how many millions of dollars have been channeled to organizations or individuals that are an existential threat to our societies?

Provenance is profound, and in this age of augmented reality and virtual reality, actual reality will surely make a comeback. Authenticated authenticity is an asset of increasing value in an age of the artificial—understanding the ebb and flow of humanity will not be based on fake news or ersatz empathy, but on real insight.

Mr. Thomson is the chief executive of News Corp, which owns The Wall Street Journal. This is adapted from a speech he delivered on March 29 to the Asia Society in Hong Kong.


Bill Gates on Robots

Bill Gates is a smart guy, but I think he is missing the point with his view of robots and the impact on jobs.
WSJ 4/26/2017 By Andy Kessler

Bill Gates, meet Ned Ludd.

Ned, meet Bill. Ludd was the 18th-century folk hero of anti-industrialists. As the possibly apocryphal story goes, in the 1770s he busted up a few stocking frames—knitting machines used to make socks and other clothing—to protest the labor-saving devices. Taking up his cause a few decades later, a band of self-described “Luddites” rebelled by smashing some of the machines that powered the Industrial Revolution.

Apparently this is the sort of behavior that would make Mr. Gates proud. Last month in an interview with the website Quartz, the Microsoft founder and richest man alive said it would be OK to tax job-killing robots. If a $50,000 worker was replaced by a robot, the government would lose income-tax revenue. Therefore, Mr. Gates suggested, the feds can make up their loss with “some type of robot tax.”

This is the dumbest idea since Messrs. Smoot and Hawley rampaged through the U.S. Capitol in 1930. It’s a shame, especially since Bill Gates is one of my heroes.

When I started working on Wall Street, I was taken into rooms with giant sheets of paper spread across huge tables. People milled about armed with rulers, pencils and X-Acto Knives, creating financial models and earnings estimates.

Spreadsheets, get it? This all disappeared quickly when VisiCalc, Lotus 1-2-3 and eventually Microsoft Excel automated the calculations. Some fine motor-skill workers and maybe a few math majors lost jobs, but hundreds of thousands more were hired to model the world. Should we have taxed software because it killed jobs? Put levies on spell checkers because copy editors are out of work?

Mr. Gates killed as many jobs as anyone: secretaries, typesetters, tax accountants—the list doesn’t end. It’s almost indiscriminate destruction. But he’s my hero because he made the world productive, rolling over mundane and often grueling jobs with automation. The American Dream is not sorting airline tickets, setting type or counting $20 bills. Better jobs emerged.

Mr. Gates may be worth $86 billion—who’s counting?—but the rest of the world made multiples of his fortune using his tools. Society as a whole is better off. In August 1981, when Microsoft’s operating system first began to ship, U.S. employment stood at 91 million jobs. The economy has since added 53 million jobs, outpacing the rate of population growth.

Even better, the Third World is rising out of poverty because of improved logistics from personal computers and servers. This has dramatically lowered the cost of basic food, energy and health care. None of this happens without productive tools—doing more with less.

What’s most disturbing is that the Luddites never totally went away. How many times have we been subject to proposals that would tax progress? ObamaCare’s regulations froze the medical industry. Its 2.3% medical-device tax was even worse, discouraging investment in one of the few innovative health-care sectors. Mileage standards on automobiles were a waste of resources contributing to the moronic Detroit bailout in 2009. Even a carbon tax is Ludd-like, raising the cost of energy to slow its consumption.

There is a murmuring movement out of Europe known as “degrowth.” If this sounds to you like a cabal of cave dwellers, you’re not that far off. Degrowth Week in Budapest last summer featured enchanting sessions like this one: “Popular competence building against the Technocracy.” Channeling Ludd, industrial insurgents and sustainability samurais want to keep things the way they are, like the eco-protesters at Standing Rock. The site degrowth.org is clear about the movement’s unproductive goals: Consume less and share more.

OK, but do you want to give up Google Maps, Snapchat and future innovations? Pry them out of my cold dead thumbs. Surely Mr. Gates knows that his charitable foundation’s efforts to eradicate malaria and other diseases require a lot of productive capital and hard work. I can’t picture him clamoring to tax robots that lower the cost of malaria drugs or mosquito nets. That kind of tax would kill off the next wave of disease-killing productivity.

I don’t think Mr. Gates wants to be the poster boy for the degrowth movement. He knows how hard progress is. After PCs, Microsoft missed the start of every subsequent technology trend: browsers, video streaming, search, smartphones and cloud computing. Today the company is playing catch-up with neural computing, which drives image recognition and other robotic cognitive skills. This type of innovation, even if it destroys jobs near-term, needs to be nurtured and encouraged. Burden progress with taxes, and degrowth is what you’ll get.

Mr. Kessler, a former hedge-fund manager, is the author of “Eat People” (Portfolio, 2011).