Category Archives: Technology

Technology in the classroom?

I would have told them years ago that given a choice, I would choose a school with ZERO technology over one with the best technology.

WSJ 9/4/2019

Districts spent millions on laptops, apps and devices; parents worry it’s not helping.

When Baltimore County, Md., public schools began going digital five years ago, textbooks disappeared from classrooms and paper and pencils were no longer encouraged. All students from kindergarten to 12th grade would eventually get a laptop, helping the district reach the “one-to-one” ratio of one for each child that has become coveted around the country. Teaching apps and digital courses took the place of flashcards and notebooks.

Despite the investment, academic results have mostly slipped in the district of about 115,000 students.

Over the last decade, American schools embraced technology, spending millions of dollars on devices and apps, believing its disruptive power would help many children learn faster, stay in school and be more prepared for a competitive economy. Now many parents and teachers are starting to wonder if all the disruption was a good idea.

Technology has made it easier for students and teachers to communicate and collaborate. It engages many students and allows them to learn at their own pace. But early indications are that tech isn’t a panacea for education.

Researchers at Rand Corp. and elsewhere say there is no clear evidence showing which new tech-related education offerings or approaches work in schools.

The uncertainty is feeding alarm among some parents already worried about the amount of time their children spend attached to digital devices. Some believe technology is not doing much to help their kids learn, setting up a clash with tech advocates who say technology is the future of education.

Across the country—in Boston, Fort Wayne, Ind., and Austin, Texas—parents are demanding proof that technology works as an educational tool, and insisting on limits. They’re pushing schools to offer low- or screen-free classrooms, picketing board meetings to protest all the online classes and demanding more information about what data is collected on students.

In April, a report from the National Education Policy Center, a nonpartisan research group at the University of Colorado at Boulder, found the rapid adoption of the mostly proprietary technology in education to be rife with “questionable educational assumptions… self-interested advocacy by the technology industry, serious threats to student privacy and a lack of research support.”

‘Digital citizens’

Proponents say schools must have technology. “We are moving into a time of exponential change,” says Keith Krueger, CEO of Consortium for School Networking, an association for school tech officers that also includes tech companies. “Schools are not at the leading edge of technology but technology is reaching a tipping point in the way learning happens.” He says, “Schools need to determine how to equip [students] to be smart digital citizens.”

Earlier this year, Cynthia Boyd watched as her daughter, Jane, then a Baltimore County first-grader, poked at a keyboard on a family laptop in response to math problems on the screen, part of her public school’s math class. An interactive program delivered an image of a token on the screen when she completed an assignment. She could then trade in those tokens to play games.

To Dr. Boyd, a professor of medicine at Johns Hopkins, it looks more like a videogame than a math class. She isn’t sure if the lessons are sticking with Jane and worries about the hyper-stimulating screen time. “I feel like my kids have been part of a huge massive experiment I have no control over,” says Dr. Boyd, who also has two sons aged 13 and 15. Liam, the younger son, began learning on screens three years ago and Graham, the older son, began learning on them four years ago.

There is a role for technology in school, she says, but it is a matter of how much and at what age. She has seen the benefits: Her younger son learned the basics of computer programming and the logic involved. Tech allowed her older son to build a website for a history project and to use original documents from the Library of Congress online.

Baltimore County’s latest results on state standardized tests, released last week, were so disappointing they prompted a letter to parents from the district’s new school superintendent, Darryl Williams. In English language arts and math, “far too few of our students are meeting or exceeding expectations…We can and must do better,” he wrote.

About 37% of the county’s students in grades 3-8 were proficient in English language arts, compared with about 44% statewide. In math, about 27% of the students were proficient, compared with 33% statewide. Dr. Williams wasn’t available for comment. A school district spokesman said Dr. Williams and his team will be closely evaluating what role, if any, the laptop program has played in the district’s performance.

Baltimore County said earlier this year it would scale back the ratio of laptops in first and second grades to one for every five students. A few miles away, in Montgomery County, a new curriculum this fall will return textbooks, paper and pencils to the classroom to supplement laptops.

In both cases, school officials say they were responding, in part, to parents and teachers. Baltimore County’s early-learning teachers said they didn’t need so many laptops. Parents wanted their children to “have a mixed media experience touching paper and reading books and down on the carpet without a device in their hands,” said Ryan Imbriale, who heads the school district’s department of innovative learning.

In Montgomery County, which was due for a curriculum change, “we heard loud and clear from parents,” said Dr. Maria Navarro, chief academic officer. Among other things, “they are very concerned about the amount of screen time” in school.

Technology is especially effective, she says, when it allows aspiring engineers and architects to build simulations in a high-school drafting class, for instance, or non-native English speakers to get an instant translation by touching a word in their reading assignment. “The question is what’s the best use of this tool,” she says, “and what’s the right quality and quantity.”

Montgomery County schools improved in nearly all categories in the results released last week, with average gains better than those seen statewide for students in grades 3 through 8.

Technology certainly helps solve some big problems. At a time when all 50 states report teacher shortages, instructors are being live-streamed into classrooms to remotely fill vacancies in about 180 school districts. In places such as Hot Springs, Va., Alphabet Inc.’s Google has wired school buses, turning them into rolling study halls for students with long commutes and patchy or nonexistent Wi-Fi at home.

The widespread adoption of technology in American schools, institutions often known for changing at a glacial pace, has been jarring for large numbers of parents, teachers and administrators—many of whom were big proponents just a few years ago. Parents say the school devices are undermining their struggle to limit screen time: their children find ways at school to bypass internet filters and end up on YouTube, where they are exposed to X-rated and violent content, and they can kill time in class by shopping online or playing games.

It was just eight years ago that President Obama signaled in his State of the Union Address that he envisioned U.S. students reading digital textbooks; soon after, his administration set a five-year goal. The call was hardly controversial; it was premised on the widespread belief that digital innovations would boost graduation rates, help close the persistent socio-economic achievement gap and keep American students from falling behind their peers in other countries.

The Trump administration’s Education Secretary, Betsy DeVos, is a proponent, saying in a statement, “When applied appropriately, technology in the classroom opens up a world of possibilities for students.”

Device usage in schools is rising. A 2018 report by the Consortium for School Networking found 59% of high schools reported that all of their students had access to non-shared devices, up from 53% the year prior. Middle schools reported 63%, up from 56%, and elementary schools reported 29%, up from 25%. In some cases, schools provide the devices; in others, students bring their own.

Google has 60% of the school device market; Apple Inc. and Microsoft Corp. split the rest about equally, according to Futuresource Consulting, a market research firm.

The digital push coincided with the rise in enthusiasm for personalized learning. Championed by advocates including the Bill and Melinda Gates Foundation, the tailored-to-the-child approach is designed to allow students to learn according to their needs and interests—at their own pace. The flexibility is made possible by individually assigned laptops and tens of thousands of new digital learning programs.

“Technology is one tool—an extremely promising one if we use it well. But we have to be clear-eyed,” says Robert L. Hughes, who directs U.S. K-12 Education for the Gates foundation. He says there is still “wide variation” in performance—between classrooms, schools and districts using technology to personalize learning. “It is the very early days.”

Mr. Gates has indicated problems are to be expected in the early stages of educational software development. “The stand-alone textbook is becoming a thing of the past,” he wrote in the foundation’s annual letter in February.

What happens next? “The same basic cycle you go through for all software: get lots of feedback on the existing products, collect data on what works and make them better,” he wrote. A foundation spokesman said Mr. Gates wasn’t available to comment.

The individualized learning approach assumes that if given choices and goals, children will stay motivated to do their best. Some parents say that is asking a lot of many kids. “Put screens in front of children and they aren’t thinking ‘I can’t wait to do research,’ they’re thinking, ‘Let’s play Candy Crush,’ ” says Melanie Hempe, a Charlotte, N.C., mother and author of a book for parents who want to limit computer use called “Screen Strong.”

Maryland is one of the states grappling most aggressively with how to calibrate the role of technology in schools, this summer requiring officials to create a set of best practices for devices in the classroom. About half the guidelines address health issues like proper ergonomics and eye safety. Others remind teachers to promote student collaboration and reward good behavior with social and physical activity—not more screen time.

Many teachers support using technology broadly but are concerned about whether it is being used too much or in the right way. Diane Birdwell, a world history teacher in Dallas, said that when she allowed students to use devices, they were not only distracted by pop-up alerts but also didn’t retain information or comprehend as well as when reading on paper. “It has hampered their ability to think on their own,” said Ms. Birdwell, a 58-year-old teacher of 20 years.

Florence Kao, a Montgomery County, Md., lawyer, says that since third grade, her two sons have been using Google Slides, part of Google’s office suite, for their projects. They are now PowerPoint design experts, she says, picking colors, fonts, background images and deciding whether to put each slide in a bubble or a frame. “The ratio of time they spend writing on each slide compared to embellishing it is probably 10% to 90%,” she says. She wonders how well they’re learning to write.

Vetting Apps

Montgomery County schools have roughly 1,000 apps, digital curriculum offerings and other online tools that teachers have chosen to use. Some Montgomery County parents formed a Safe Tech Committee that now meets with the district’s chief tech officer, Pete Cevenini, to report problem apps and share other concerns. He has set up a protocol for teachers and administrators to vet apps. As the start of the school year approaches, 22% have been vetted.

A report from Rand Corp. in October cited a lack of rigorous evidence showing which new education practices and tools are effective, saying the offerings are “relatively immature, fragmented and of uneven quality.” In a peer-reviewed article, the research firm described strategies to guide teachers and administrators “in the absence of proven-effective models.”

Dr. Boyd, the Johns Hopkins professor, says all her children have been frustrated by digital courses. Her sons have since switched to private schools where the curriculum is less technology intensive. She is glad Jane’s school district is reevaluating and making changes.


The Deep Dangers of Life Online – WSJ

By Daniel Henninger, WSJ 8/7/2019

The online forum 8chan is better known than it was last week because the El Paso shooter, Patrick Crusius, uploaded his “manifesto” to the site before he murdered 22 people. 8chan has also been linked to the mass murder in Christchurch, New Zealand, and to a killing in April at a synagogue in California. A similar online forum, Gab.com, was allegedly used by the shooter who killed 11 people at a Pittsburgh synagogue in October. The Dayton shooter, Connor Betts, spent much of his time raving on Twitter.

8chan describes itself as a forum for unimpeded, uncensored “free speech.” That is wrong. 8chan is a nut house. But don’t blame 8chan, Gab.com or Twitter. Blame the internet. More specifically, blame the inevitable deterioration of lives lived online.

It is conventional wisdom that the internet has become a toxic force. Possibly the past year’s most astonishing news was that parents in Silicon Valley, where life online was created, are trying to keep their children away from screens.

As far as I know, none of this was predicted.

The internet was developed in the 1960s mainly by the U.S. government’s Defense Advanced Research Projects Agency to ensure the country had a communications system that could survive a nuclear attack. Based on the structure of the internet, the World Wide Web appeared in the early 1990s.

It isn’t possible to overstate the magic of the web. Or at least its early magic. With just a click, one could summon forth the whole world in pixels on a screen.

Once software engineers mastered the technology beneath the web, they manufactured countless apps that let everyone do just about anything—texting, image sharing, self-monitoring.

Cellphones, invented to make personal telephone calls possible anywhere, eventually allowed people to hold the web’s magic in their hands and go deeper and deeper into what it offered. Now, like the sorcerer’s apprentice, we are discovering that the magic can turn uncontrollably malignant.

Going back several years, I was startled at stories that began to appear in the Chronicle of Higher Education about the high incidence of anxiety among college students, and how university mental-health clinics were hard-pressed to handle their requests for help. “According to data from the 2013 National College Health Assessment,” said one of these stories, “nearly half of 123,078 respondents from 53 colleges and universities across the country felt overwhelming anxiety over the previous year and a third had problems functioning because of depression.”

A third had problems with depression? At first I dismissed these stories as overstating the normal anxieties and coping problems of young people. But I don’t think that anymore. Even anecdotally the rise in emotional instability is obvious. What happened?

Anxiety has existed since Adam and Eve. W.H. Auden memorialized its modern version in his long poem, “The Age of Anxiety” (1947). It comes and goes occasionally for everyone. Let us posit for discussion that anxiety runs along a scale of 1 to 100. Below 50, people cope. Above 50, anxiety starts tipping toward neurosis and more difficult coping challenges until it extends out to 100 and personal destruction.

It’s remarkable to think that across millennia, even after Freud popularized the idea of neurosis, most people managed to stay below 50—until about the year 2000. Then, as surveys suggest, it appears that masses of people—especially in the U.S., for some reason—started finding themselves drifting past 50, into deeper and more dangerous levels of anxiety. It now seems clear that one consequence of more people hitting 100 is more mass murder by young men who simply break down.

Whether the adaptability of the human brain is the invention of God or Darwin, I don’t think it was designed to endure the volume of relentless inner-directedness that is driven by these new screens. It is not natural or normal.

Anyone who spends that much time immersed inside their own psyche is headed for trouble—whether adolescent girls staring at images of women Photoshopped to perfection, college students measuring themselves constantly against other students, or young men in a state of daily or hourly anger over immigrants and other enemies. The stakes online become impossibly and inhumanly high.

A more pertinent question this tragic week is why the American system stands frozen amid what looks like a quiet epidemic of psychological and emotional erosion. The speed with which the system—politicians and the press—defaulted from the killers themselves into a paroxysm over Donald Trump has been, well, depressing.

Gun-control laws? Maybe, but no serious person can believe they would be much more than thumbs in a dike standing against a more massively destructive force turning young men into zombielike killers. Forget alt-right and alt-left. Life lived online, as practiced by many, is a destructive, dehumanizing alt-reality.

The screen genies are out of the bottle. Banning them won’t work. Maybe the app masters who elevated self-obsession on Instagram and 8chan could turn toward apps rooted in reality. After last week, we have nowhere to go but up.


Not Too Late to Quit Social Media – Cal Newport

Should be required reading for all. [As in ‘everyone’].
=====
WSJ 1/26/2019
By Kate Bachelder Odell

Americans may not agree on much, but here’s one point of consensus: Social media isn’t entirely wonderful. Facebook has its privacy scandals, and who would join Twitter for the camaraderie? This week an ugly online mob demonstrated the point by setting upon a group of boys on a field trip to Washington from Kentucky’s Covington Catholic High School.

“Because I don’t have any social-media accounts,” says Cal Newport, a Georgetown University computer scientist, “my encounter with the Covington Catholic controversy was much different than most people’s.” He read about it days later, in a newspaper column. “I learned that the social-media reaction had been incendiary and basically everyone was now upset at each other, at themselves, at technology itself. It sounded exhausting.”

Mr. Newport, 36, appreciated the downsides of social media sooner than most. In 2010 he published “An Argument for Quitting Facebook,” a blog post that came with a graphic of the “deactivate account” function on an amusingly out-of-date Facebook version. “Technologies are great,” he wrote, “but if you want to keep control of your time and attention,” you should “insist that they earn their keep before you make them a regular part of your life.” He has been proselytizing against social media ever since. His book on the subject, “Digital Minimalism: Choosing a Focused Life in a Noisy World,” hits stores (and e-readers) next month.

He has never had a social-media account. (“It turns out that this is allowed,” he once joked on his blog.) But he noticed that social media seemed to impair others’ ability to concentrate—an essential skill for professional and personal success. “Right around the transition to mobile” from desktop computers, he tells me, he observed that for many people a passing interest in social media was morphing into “compulsive use.”

“Old social media was a much slower-moving medium,” he says. “You would maybe update your profile occasionally. So if you went on to check what your friends were up to in the morning, there would be no reason to check in the afternoon. Nothing had changed.”

Then came the smartphone—a pocket-size supercomputer that travels everywhere. Social media became a ubiquitous presence. That suited the commercial interests of social-media companies, which “couldn’t triple or quadruple the user-engagement numbers if people log on Monday just to see if someone is back from vacation.” They needed to reel users back in. “And this is where you get the rise of, let’s say, the ‘like’ button or tagging photos,” or retweets and heart buttons—what Mr. Newport calls “small indicators of approval.”

These created “a much richer stream of information coming back to the user,” which proved seductive: “Now you have a reason to click the app again an hour later.” The reinforcement is all the more insidious for being intermittent. Sometimes you’re rewarded for checking in, sometimes you’re frustrated. “It just short-circuits the dopamine system,” Mr. Newport says—which feeds the compulsion. He likens using social media at work to “having a slot machine at your desk.”

Facebook introduced a feature that recognizes faces in photos and encourages users to tag their friends. “That’s a really hard computer-science problem,” Mr. Newport says. “Why would you spend millions of dollars to try to master that problem?” Because, he maintains, it’s another indicator of approval that lures users back to the site.

Mr. Newport says he used to write “earnest” blog posts asking readers what he was missing by not having social media, and he “couldn’t get a straight answer,” aside from the question-begging one that he might miss something. But in recent years skepticism has been growing. Facebook is on the defensive, with CEO Mark Zuckerberg offering reassurances like “I believe everyone should have a voice and be able to connect.”

Yet all those voices can add up to a grating din, as in the Covington Catholic fracas. “Is anyone better off for having wasted hours and hours of time this past week exhaustively engaging half-formed back-and-forth yelling on social media?” Mr. Newport asks. It sounds like a rhetorical question, but then he answers it: “On reflection, the answer is yes for one group in particular: the executives at the giant social-media conglomerates, who sucked up all those extra ‘user engagement’ minutes like an oil tycoon who just hit a gusher.”

Political engagement, however, is a mixed blessing for social-media companies. Mr. Newport says public skepticism reached critical mass “about six months after the presidential election.” Everyone had something to dislike: “If you’re more on the left, it was the election manipulation; if you’re more on the right, it was these stories about ‘Are we being censored?’ ” he says.

Those complaints brought to the surface deeper sources of dissatisfaction. “When I talk to people now who are very distressed about their digital life,” Mr. Newport says, “it’s not those original political things that they care about. It’s not that ‘I don’t like what Russia did in the election’; it’s, ‘I’m on this more than is useful, more than is healthy. It’s keeping me from my kids. It’s keeping me from my friends. It’s keeping me from things I used to enjoy. I think it’s hurting the quality of my life.’ ”

In hindsight, Mr. Newport says, “we should have been more wary about this idea” of taking human sociality—“incredibly powerful and shaped by a million years of evolution”—and allowing 22-year-olds in California to reinvent it.

So what now? Mr. Newport laments that everyone “writes the same article” with tips for turning off notifications or some such. “This is not working.” What people need, he thinks, is “a full-fledged philosophy” of how to use technology. About a year ago Mr. Newport invited his blog’s readers to participate in an experiment he called a “digital declutter.”

The prescription: Take a month off from all digital technologies you don’t absolutely have to use—including Facebook, Twitter, Instagram, even casual texting with friends. Spend the days figuring out what you’d like to do with your time. At the end of the experiment, resume technologies only to the extent that they’re the best way to accomplish something you value deeply.

A couple hundred readers sent Mr. Newport detailed reports. One theme he noticed is that time online had crowded out activities like joining a church committee or a running club. Mr. Newport calls them “analog social media,” an amusing retronym. But he isn’t being cheeky. He says people had failed to realize the extent to which the internet “had subtly pushed the analog leisure they used to like out of their life.”

Human beings “crave high-quality leisure,” he says, but they can do without it “if you can fill every moment with distraction.” When the digital declutterers regained time to cook or see close friends, “they essentially lost their taste” for staring at the screen and scrolling.

When I meet Mr. Newport, I’m on Day 17 of my own declutter experiment. Facebook was the easiest platform to dump. I’d already stopped posting and visited only occasionally, mostly—this is embarrassing—to keep up with a group run by my dog’s breeder, where other owners post photos of their dogs.

Mr. Newport says I’m not unusual. “Man, I’m glad I don’t own their stock,” he says. “They’re worth a lot of money,” he allows, but they seem to have “a very weak connection to their user base. It’s a much more fickle user base than they probably want to admit. Because people—I get this experience all the time—people are fine walking away from it. They’re really indifferent.” That may be less true of other Facebook-owned services like Instagram, which I’ve found tough to ditch, or WhatsApp.

How about Twitter? For journalists, it’s an office water cooler, except that everyone yells. It can also be a valuable tool for gathering news. “Everyone I know in media is having this exact same crisis with Twitter,” Mr. Newport says. “It’s either ‘Burn it to the ground’ or ‘It’s at the core of what I do,’ and they’re not sure.” Mr. Newport advises outlets to have entry-level employees monitor Twitter rather than let the site sap the entire staff’s productivity.

Though in his day job Mr. Newport writes technical works like the 2018 paper “Fault-Tolerant Consensus With an Abstract MAC Layer,” he’s been at the self-improvement game for a while. He started out writing books about how students could stand out in high school and college. His career followed his interests as he progressed from graduate student to professor and wondered why some people thrive professionally and others don’t. That led to his career-advice books, “So Good They Can’t Ignore You” and “Deep Work.”

The latter takes aim at another technology that corrodes the ability to focus: corporate email. “We sort of gambled on this idea that the key to productivity is going to be faster and more flexible communication,” Mr. Newport says. “At any moment we can have fast and flexible communication with anyone on earth.” Yet productivity has hardly budged. “It’s actually probably going down.”

The statistics, he says, fail to “capture the sort of secret second shifts that people are doing at night and on the weekends just to try to catch up.” It turns out that “focusing on fast and flexible communication—my argument is—didn’t make us more productive.” Instead, “it made our brains much, much less effective at the actual work.”

To illustrate the point, he sometimes cites an interview with Jerry Seinfeld. “Let me tell you why my TV series in the ’90s was so good,” the comedian said. “In most TV series, 50% of the time is spent working on the show; 50% of the time is spent dealing with personality, political, and hierarchical issues of making something. We”—he and Larry David—“spent 99% of our time writing. Me and Larry. The door was closed. Somebody calls. We’re not taking the call. We were gonna make this thing funny. That’s why the show was good.”

To have an excellent career, Mr. Newport argues, you need periods of uninterrupted concentration to produce work of unambiguous value. Many jobs lack a clear measure of value, so that employees treat “busyness as a proxy for productivity” and let email distract them from real work.

Yet as with social media, the thought of giving up email stirs a fear of missing out. Some may protest—as I did—that if they quit Instagram they’ll lose track of old friends. Mr. Newport replies that “this idea that it’s important to maintain hundreds or thousands of weak-tie connections” is recent and untested. “I can’t see any great evidence that this is important to have.” And it can keep you from “investing more time into the types of relationships that have defined human sociology for centuries, which are close friends, family members and community.”

Fewer people “will send you a digital happy-birthday note,” he concedes. “But that’s about it.”

Mrs. Odell is an editorial writer for the Journal.

Appeared in the January 26, 2019, print edition.


The Interconnected World and Martin Luther?

Some thoughts about how things change, and yet how much stays the same…
=========
By Niall Ferguson
Originally published in The Sunday Times, October 1, 2017

Just as Martin Luther’s utopian vision and the invention of the printing press led to an era of religious war and turmoil, the internet, hailed as a portal to a better world, is threatening democracy

The hyperconnected world was not supposed to be like this. In May, Evan Williams, one of the founders of Twitter, told The New York Times: “I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place. I was wrong about that.”

In September Sheryl Sandberg, Facebook’s chief operating officer, acknowledged that the company’s online tools had allowed advertisers to target self-described “Jew haters”. “We never intended or anticipated this functionality being used this way,” she admitted, “and that is on us.”

Surprise! The men and women who built the internet-based social networks that have so transformed our lives thought everything would be awesome if only we could all be connected. Speaking at Harvard’s degree ceremony in May, Facebook’s co-founder and chief executive, Mark Zuckerberg, looked back on his undergraduate ambition to “connect the whole world”. “This idea was so clear to us,” he recalled, “that all people want to connect . . . My hope was never to build a company, but to make an impact.”

Facebook certainly made an impact last year, but not quite the impact the young Zuckerberg had in mind in his Harvard dorm. A committed believer in globalisation who tends to wear his liberal politics on his T-shirt sleeve, Zuckerberg is reeling. Not only did the masterminds behind the Brexit and Trump campaigns successfully use Facebook advertising to hone and target their ultimately victorious campaign messages; worse, the Russian government appears to have used Facebook in the same way, seeking to depress voter support for Hillary Clinton. Worse still, neo-Nazis seem to have been using the social network to spread their own distinctive brand of hate.

Yet the architects of the biggest social networks to have existed should not have been surprised. If he had studied history at Harvard rather than psychology and computer science, Zuckerberg might have foreseen the ways in which Facebook and its ilk would be used and abused.

Five hundred years ago this year, Martin Luther sent his critique of corrupt church practices as a letter to the Archbishop of Mainz. It is not wholly clear if Luther also nailed a copy to the door of All Saints’ Church, Wittenberg, but it scarcely matters. Thanks to the invention of the printing press by Johannes Gutenberg, that mode of publishing had been superseded.

Before 1517 was out, versions of Luther’s original Latin text had been printed in Basel, Leipzig and Nuremberg. By the time Luther was officially condemned as a heretic by the Edict of Worms in 1521, his writings were all over German-speaking Europe. In the course of the 16th century, German printers produced almost 5,000 editions of Luther’s works.

Luther’s vision was utopian. Just as Zuckerberg today dreams of creating a single “global community”, so Luther believed that his Reformation would produce a “priesthood of all believers”, all reading the Bible, all in a direct relationship to the one, true God.

It didn’t turn out that way. The Reformation unleashed a wave of religious revolt against the authority of the Roman Catholic Church. As it spread from reform-minded clergymen and scholars to urban elites to illiterate peasants, it threw first Germany and then all of northwestern Europe into turmoil.

In 1524 a full-blown peasants’ revolt broke out. By 1531 there were enough Protestant princes to form an alliance (the Schmalkaldic League) against the Holy Roman Emperor, Charles V. Although defeated, the Protestants were powerful enough to preserve the Reformation in a patchwork of territories.

Religious conflict erupted again in the Thirty Years’ War, a conflict that turned central Europe into a charnel house. Especially in northwestern Europe – in England, Scotland and the Dutch Republic – it proved impossible to re-establish Roman Catholicism, even when Rome turned the technologies and networking strategy of the Reformation against it, in addition to the more traditional array of cruel tortures and punishments that had long been the church’s forte.

The global impact of the internet has few analogues in history better than the impact of printing on 16th-century Europe. The personal computer and smartphone have empowered networks as much as the pamphlet and the book did in Luther’s time.

Indeed, the trajectories for the production and price of PCs in America between 1977 and 2004 are remarkably similar to the trajectories for the production and price of printed books in England from 1490 to 1630.

In the era of the Reformation and thereafter, connectivity was enhanced exponentially by rising literacy, so that a growing share of the population was able to access printed literature of all kinds, rather than having to rely on orators and preachers to convey new ideas to them.

There are three major differences between our networked age and the era that followed the advent of European printing. First, and most obviously, our networking revolution is much faster and more geographically extensive than the wave of revolutions unleashed by the German printing press.

In a far shorter space of time than it took for 84% of the world’s adults to become literate, a remarkably large proportion of humanity has gained access to the internet. As recently as 1998 only about 2% of the world’s population were online. Today the proportion is two in five. The pace of change is roughly an order of magnitude faster than in the post-Gutenberg period: what took centuries after 1490 took just decades after 1990.

[This is just unbelievable.] Google started life in a garage in Menlo Park, California, in 1998. Today it has the capacity to process more than 4.2bn search requests every day. In 2005 YouTube was a start-up in a room above a pizzeria in San Mateo. Today it allows people to watch 8.8bn videos a day. Facebook was dreamt up at Harvard just over a decade ago. Today it has more than 2bn users who log on at least once a month.

The scale of Facebook’s success is especially staggering. Two-thirds of American adults are Facebook users. Just under half get their news from Facebook.

It used to be said that there were six degrees of separation between any two individuals on the planet – say, between yourself and Monica Lewinsky. On Facebook there are just 3.57 degrees of separation, meaning that any two of the 2bn Facebook users can get in touch by taking fewer than four steps through the network. The world is indeed connected as never before. We are all friends of friends of friends of friends.

Second, the distributional consequences of our revolution are quite different from those of the early-modern revolution. Early modern Europe was not an ideal place to enforce intellectual property rights, which in those days existed only when technologies could be secretively monopolised by a guild. The printing press created no billionaires.

Johannes Gutenberg was not Bill Gates (indeed, by 1456 he was effectively bankrupt). Moreover, only a subset of the media made possible by the printing press – the newspapers and magazines invented in the 18th century – sought to make money from advertising, whereas all the most important ones made possible by the internet do. Few people foresaw that these giant networks would be so profoundly inegalitarian.

To be sure, innovation has driven down the costs of information technology. Globally, the costs of computing and digital storage fell at annual rates of, respectively, 33% and 38% between 1992 and 2012. Everyone has benefited from that. However, oligopolies have developed in the realms of both hardware and software, as well as service provision and wireless networks.
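[A rough compounding check conveys the scale of those rates. Assuming, on my part, that the quoted annual declines held steadily over the 20 years between 1992 and 2012:

$$(1-0.33)^{20}\approx 3.3\times10^{-4}, \qquad (1-0.38)^{20}\approx 7.0\times10^{-5}$$

That works out to roughly a 3,000-fold fall in computing costs and a 14,000-fold fall in storage costs over the period.]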

The ownership of the world’s electronic network is extraordinarily concentrated. Google (or rather the renamed parent company, Alphabet Inc) is worth $669bn by market capitalisation. About 16% of its shares, worth around $106bn, are owned by its founders, Larry Page and Sergey Brin. The market capitalisation of Facebook is approaching $500bn; 475 million of the shares, worth about $81bn, are owned by its T-shirt-loving founder.

Unlike in the past, there are now two kinds of people in the world: those who own and run the networks, and those who merely use them.

Third, the printing press had the effect of disrupting religious life in western Christendom before it disrupted anything else. By contrast, the internet began by disrupting commerce; only very recently did it begin to disrupt politics, and it has really disrupted only one religion, namely Islam.

The political disruption reached a climax last year, when social networks helped to topple David Cameron in the Brexit referendum and to defeat Hillary Clinton in the US presidential election.

In the American case, a number of networks were operating. There was the grassroots network of support that the Trump campaign built – and that built itself – on the platforms of Facebook and Twitter. These were the “forgotten” men and women who turned out on November 8 to defeat the “failed and corrupt political establishment” that Trump’s opponent was said to personify.

A role was also played by the jihadist network, as the Isis-affiliated terror attacks during the election year lent credibility to Trump’s pledges to “strip out the support networks for radical Islam” and to ban Muslim immigration.

Yet in two respects there is a clear similarity between our time and the revolutionary period that followed the advent of printing. Like the printing press, modern information technology is transforming not only the market – most recently, by facilitating the sharing of cars and homes – but also the public sphere. Never before have so many people been connected in an instantly responsive network through which “memes” can spread even more rapidly than natural viruses.

But the notion that taking the whole world online would create a utopia of netizens, all equal in cyber-space, was always a fantasy – as much a delusion as Luther’s vision of a “priesthood of all believers”. The reality is that the global network has become a transmission mechanism for all kinds of manias and panics, just as the combination of printing and literacy for a time increased the prevalence of millenarian sects and witch crazes. The cruelties of Isis seem less idiosyncratic when compared with those of some governments and sects in the 16th and 17th centuries.

Second, our time is seeing an erosion of territorial sovereignty. In the 16th and 17th centuries, Europe was plunged into a series of religious wars. Spain and France tried by fair means and foul to bring England back to the Roman Catholic fold. As late as 1745, a French-backed army of Scottish Highlanders invaded England with a view to restoring the old faith in the British Isles.

In the 21st century, we see a similar phenomenon of escalating intervention in the domestic affairs of sovereign states. There was, after all, a further network involved in the US election of 2016, and that was Russia’s intelligence network.

It is clear that the Russian government did its utmost to maximise the damage to Clinton’s reputation stemming from her and her campaign’s sloppy email security, using WikiLeaks as the conduit through which stolen documents were passed to the western media. Russian hackers and trolls last year posed a threat to American democracy similar to the one that Jesuit priests posed to the English Reformation: a threat from within sponsored from without.

Leave aside the question of whether or not the Russian interference decided the election in favour of Trump; suffice to say it helped him, though both fake and real news damaging to Clinton was also disseminated without Russia’s involvement. Leave aside, too, the as yet unresolved questions of how many members of the Trump campaign were complicit in the Russian operation, and how much they knew.

The critical point is that Facebook itself may have decided the outcome of an election that would have gone the other way if about 40,000 voters in just three states had chosen Clinton over Trump.

No, it wasn’t meant to be this way. This was not what Silicon Valley envisaged when it set out to build “a planet where everything is connected” – the motto of Eric Schmidt’s foundation.

But then Luther didn’t set out to cause 130 years of bloody religious warfare either.

© Niall Ferguson 2017

Extracted from The Square and the Tower: Networks, Hierarchies and the Struggle for Global Power by Niall Ferguson
