Tech causes more problems than it solves

A number of respondents to this canvassing about the likely future of social and civic innovation shared concerns. Some said that technology causes more problems than it solves. Some said it is likely that emerging worries over the impact of digital life will be at least somewhat mitigated as humans adapt. Some said it is possible that any remedies may create a new set of challenges. Others said humans’ uses and abuses of digital technologies are causing societal harms that are not likely to be overcome.

The problem is that we are becoming more and more dependent on machines and hence more susceptible to bugs and system failures.

YAAKOV J. STEIN

Larry Masinter, internet pioneer, formerly with Adobe, AT&T Labs and Xerox PARC, who helped create internet and web standards with IETF and W3C, said, “Technology and social innovation intended to overcome the negatives of the digital age will likely cause additional negative consequences. Examples include: the decentralized web, end-to-end encryption, AI and machine learning, social media.”

James Mickens, associate professor of computer science at Harvard University, formerly with Microsoft, commented, “Technology will obviously result in ‘civic innovation.’ The real question is whether the ‘innovation’ will result in better societal outcomes. For example, the gig economy is enabled by technology; technology finds buyers for workers and their services. However, given the choice between an economy with many gig workers and an economy with an equivalent number of traditional middle-class jobs, I think that most people would prefer the latter.”

Yaakov J. Stein, chief technology officer of RAD Data Communications, based in Israel, responded, “The problem with AI and machine learning is not the sci-fi scenario of AI taking over the world and not needing inferior humans. The problem is that we are becoming more and more dependent on machines and hence more susceptible to bugs and system failures. This is hardly a new phenomenon – once a major part of schooling was devoted to, e.g., penmanship and mental arithmetic, which have been superseded by technical means. But with the tremendous growth in the amount of information, education is more focused on how to retrieve required information rather than remembering things, resulting not only in less actual storage but less depth of knowledge and the lack of ability to make connections between disparate bits of information, which is the basis of creativity. However, in the past humankind has always developed a more-advanced technology to overcome limitations of whatever technology was current, and there is no reason to believe that it will be different this time.”

A vice president for research and economic development wrote, “The problems we see now are caused by technology, and any new technological fixes we create will inevitably cause NEW social and political problems. Attempts to police the web will cause freedom of speech conflicts, for example.”

Misinformation – pervasive, potent, problematic

Numerous experts described misinformation and fake news as a serious issue in digital spaces. They expressed concern over how users will sort through fact and fiction in the coming decade.

Stephanie Fierman, partner, Futureproof Strategies, said, “I believe technology will meaningfully accelerate social and civic innovation. It’s cheap, fast and able to reach huge audiences. But as long as false information is enabled by very large websites, such social and civic innovators will be shadow boxing with people, governments, organizations purposely countering truthful content with lies.”

An expert in the ethics of autonomous systems based in Europe responded, “Fake news is increasingly used to manipulate people’s opinions. This information war is becoming so significant that it can influence democracy and sway public opinion before an election, for instance. Some AI tools can be developed to automatically recognize fake news, but such tools can be used in turn in the same manner to enhance belief in some false information.”

A research leader for a U.S. federal agency wrote, “At this point in time, I don’t know how we will reduce the spread of misinformation (unknowing/individual-level) and disinformation (nefarious/group-level), but I hope that we can.”

A retired information science professional commented, “Dream on, if you think that you can equate positive change with everybody yelling and those with the most clout (i.e., power and money) using their power to see their agendas succeed. Minority views will always be that, a minority. At present and in the near future the elites manipulate and control.”

Privacy issues will continue to be a hot button topic

Multiple experts see a growing need for privacy to be addressed in online spaces.

Ayden Férdeline, technology policy fellow at the Mozilla Foundation, responded, “Imagine if everyone on our planet was naked, without any clear options for obtaining privacy technology (clothing). It would not make sense to ask people what they’d pay or trade to get this technology. This is a ‘build it and they will come’ kind of scenario. We’re now on the verge, as a society, of appropriately recognizing the need to respect privacy in our Web 2.0 world, and we are designing tools and rules accordingly. Back in 1992, had you asked people if they’d want a free and open internet, or a graphical browser with a walled garden of content, most would have said they prefer AOL. What society needed was not AOL but something different. We are in a similar situation now with privacy; we’re finally starting to grasp its necessity and importance.”

Graham Norris, a business psychologist with expertise in the future of work, said, “Privacy no longer exists, and yet the concept of privacy still dominates social-policy debates. The real issue is autonomy of the individual. I should own my digital identity, the online expression of myself, not the corporations and governments that collect my interactions in order to channel my behaviour. Approaches to questions of ownership of digital identity cannot shift until the realization occurs that autonomy is the central question, not privacy. Nothing currently visible suggests that shift will take place.”

Eduardo Villanueva-Mansilla, an associate professor of communications at Pontificia Universidad Catolica, Peru, and editor of the Journal of Community Informatics, wrote, “I’m trying to be optimistic, by leaving some room for innovative initiatives from civic society actors. However, I don’t see this as necessarily happening; the pressure from global firms will probably be too much to deal with.”

Jamie Grady, a business leader, wrote, “As technology companies become more scrutinized by the media and government, changes – particularly in privacy rights – will come. People will learn of these changes through social media as they do now.”

Technology use often disconnects or hollows out community

Some respondents commented on rising problems with a loss of community and the need for more-organic, in-person, human-to-human connection and the impact of digital distancing.

Jonathan Grudin, principal researcher at Microsoft, commented, “Social and civic activity will continue to change in response to technology use, but will it change its trajectory? Realignments following the Industrial Revolution resulted from the formation of new face-to-face communities, including union chapters, community service groups such as Rotary Club and League of Women Voters, church groups, bridge clubs, bowling leagues and so on. Our species is designed to thrive in modest-sized collocated communities, where everyone plays a valued part. Most primates become vulnerable and anxious when not surrounded by their band or troop. Digital media are eroding a sense of community everywhere we look. Can our fundamental human need for close community be restored or will we become more isolated, anxious and susceptible to manipulation?”

Rebecca Theobald, an assistant research professor at the University of Colorado, Colorado Springs, said, “Technology seems to be driving people apart, which would lead to fewer connections in society.”

A researcher based in North America predicted a reining in of the digital in favor of the personal: “Between email and phones, I think we’re close to peak screen time, a waste of time, and it’s ruining our eyes. Just as we have forsaken our landlines, stopped writing letters, don’t answer our cellphones, a concept of an average daily digital budget will develop, just as we have a concept of average daily caloric intake. We’ll have warning labels that rate content against recommended daily allowances of different types of content that have been tested to be good for our mental health and socialization, moderately good, bad, and awful – the bacon of digital media. And people who engage too much will be in rehab, denied child custody and unemployable. Communities, residences and vacation areas will promote digital-free, mindfulness zones – just as they have quiet cars on the train.”

Society needs to catch up and better address the threats and opportunities of tech

Some of these experts said that the accelerating technological change of the digital age is making it difficult for humans to keep up and respond to emerging challenges.

A chair of political science based in the American South commented, “Technology always creates two new problems for every one it solves. At some point, humans’ cognitive and cooperative capacities – largely hard-wired into their brains by millennia of evolution – can’t keep up. Human technology probably overran human coping mechanisms sometime in the later 19th century. The rest is history.”

There is a gap between the rate at which technology develops and the rate at which society develops. We need to take care not to fall into that gap.

LOUISA HEINRICH

Larry Rosen, a professor emeritus of psychology at California State University, Dominguez Hills, known as an international expert on the psychology of technology, wrote, “I would like to believe that we, as citizens, will aid in innovation. Smart people are already working on many social issues, but the problem is that while society is slow to move, tech moves at lightning speed. I worry that solutions will come after the tech has either been integrated or rejected.”

Bulbul Gupta, founding adviser at Socos Labs, a think tank designing artificial intelligence to maximize human potential, responded, “Until government policies, regulators, can keep up with the speed of technology and AI, there is an inherent imbalance of power between technology’s potential to contribute to social and civic innovation and its execution in being used this way. If technology and AI can make decisions about people in milliseconds that can prevent their full social or civic engagement, the incentive structures to be used toward mitigating the problems of the digital age cannot then be solved by technology.”

Gene Policinski, a journalist and First Amendment law expert at the Freedom Forum Institute, observed, “We forget how new the ‘tech revolution’ really is. As we move forward in the next decade, the public’s awareness of the possibilities inherent in social and civic innovation, the creativity of the tech world working with the public sector and public acceptance of new methods of participation in democratic processes will begin to drown out and eventually will surpass the initial problems and missteps.”

Gabriel Kahn, former bureau chief for The Wall Street Journal, now a professor of journalism researching innovation economics in emerging media at the University of Southern California, wrote, “We are not facing a ‘Terminator’-like scenario. Nor are we facing a tech-driven social utopia. Humans are catching up and understanding the pernicious impact of technology and how to mitigate it.”

Kathee Brewer, director of content at CANN Media Group, predicted, “Much like society developed solutions to the challenges brought about by the Industrial Revolution, society will find solutions to the challenges of the Digital Revolution. Whether that will happen by 2030 is up for debate. Change occurs much more rapidly in the digital age than it did at the turn of the 20th century, and for society to solve its problems it must catch up to them first. AND people, including self-interested politicians, must be willing to change. Groups like the Mozilla Foundation already are working on solutions to invasions of privacy. That work will continue. The U.S. government probably won’t make any major changes to the digital elections framework until after the 2020 election, but changes will be made. Sadly, those changes probably will result from some nastiness that develops due to voters of all persuasions being unwilling to accept electoral results, whatever the results may be.”

Valerie Bock of VCB Consulting, former Technical Services Lead at Q2 Learning, responded, “I think our cultures are in the process of adapting to the power our technologies wield, and that we will have developed some communal wisdom around how to evaluate new ones. There are some challenges, but because ordinary citizens have become aware that images can be ‘photoshopped’ the awareness that video can be ‘deepfaked’ is more quickly spreading. Cultural norms as well as technologies will continue to evolve to help people to apply more informed critiques to the messages they are given.”

Bach Avezdjanov, a program officer with Columbia University’s Global Freedom of Expression project, said, “Technological development – being driven by the Silicon Valley theory of uncontrolled growth – will continue to outpace civic and social innovation. The latter needs to happen in tandem with technological innovation, but instead plays catch-up. This will not change in the future, unless political will to heavily regulate digital tools is introduced – an unlikely occurrence.”

Despite current trends, there is reason to hope for better days

Many of the experts in this canvassing see a complicated and difficult road ahead, but express hope for the future.

Cheryl B. Preston, an expert in internet law and professor at Brigham Young University Law School, said, “Innovation will bring risk. Change will bring pain. Learning will bring challenges. Potential profits will bring abuse. But, as was the decision of Eve in the Garden of Eden, we need to leave the comfortable to learn and improve. If we can, by more informed voting, reduce the corruption in governmental entities and control corporate abuse, we can overcome difficulties and advance as a society. These advances will ultimately bring improvement to individuals and families.”

Hume Winzar, an associate professor and director of the business analytics undergraduate program at Macquarie University, Sydney, Australia, predicted, “With more hope than evidence, I’d like to think that reason will eventually overcome the extraordinary propaganda machines that are being built. When the educated upper-middle classes realise that the ‘system’ is no longer serving them, then legal and institutional changes will be necessary. That is, only when the managers who are driving the propaganda machine(s) start to feel that they, personally, are losing privacy, autonomy, money and their children’s future, then they will need to undermine the efforts of corporate owners and government bureaucrats and officials.”

Carolyn Heinrich, a professor of education and public policy at Vanderbilt University, said, “My hope (not belief) is that the ‘techlash’ will help to spur social and civic innovations that can combat the negative effects of our digitization of society. Oftentimes, I think the technology developers create their products with one ideal in mind of how they will be used, overlooking that technology can be adapted and used in unintended and harmful ways. We have found this in our study of educational technology in schools. The developers of digital tools envision them as being used in classrooms in ‘blended’ ways with live instructors who work with the students to help customize instruction to their needs. Unfortunately, more often than not, we have seen the digital tools used as substitutes for higher-quality, live instruction and have observed how that contributes to student disengagement from learning. We have also found some of the content lacking in cultural relevance and responsiveness. If left unchecked, this could be harmful for far larger numbers of students exposed to these digital instructional programs in all 50 states. But if we can spur vendors to improve the content, those improvements can also extend to large numbers of students. We have our work cut out for us!”

Yoshihiko Nakamura, a professor of mechano-informatics at the University of Tokyo, observed, “The current information and communication technology loses diversity because it is still insufficient to enhance the affectivity or emotion side of societies. In this sense I can see the negative side of current technology to human society. However, I have a hope that we can invent uses of technology to enhance the weaker side and develop tomorrow’s technology. The focus should be on the education of society in the liberal arts.”

Ryan Sweeney, director of analytics at Ignite Social Media, commented, “In order to survive as a functioning society, we need social and civic innovation to match our use of technology. Jobs and job requirements are changing as a result of technology. Automation is increasing across a multitude of industries. Identifying how we protect citizens from these changes and help them adapt will be instrumental in building happiness and well-being.”

A technology developer active in IETF said, “I hope mechanisms will evolve to exploit the advantages of new tech and mitigate the problems. I want to be optimistic, but I am far from confident.”

A renowned professor of sociology known for her research into online communications and digital literacies observed, “New groups expose the error of false equivalence and continue to challenge humans to evolve into our pre-frontal cortex. I guess I am optimistic because the downside is pretty terrible to imagine. It’s like E.O. Wilson said: ‘The real problem of humanity is the following: We have paleolithic emotions; medieval institutions; and god-like technology. And it is terrifically dangerous, and it is now approaching a point of crisis overall.’”

In the field I follow, artificial intelligence, the numbers of professionals who take seriously the problems that arise as a consequence of this technology are reassuring.

PAMELA MCCORDUCK

Source: Pew Research Center