Digital Technology’s Hidden Risks - Barron's

We’ve Entered a World of Digital Risk We’re Only Beginning to Understand

“Now I am become Death, the destroyer of worlds,” said J. Robert Oppenheimer, quoting the Bhagavad Gita. It was July 16th, 1945, and he had just witnessed the world’s first nuclear explosion as it shook the plains of New Mexico. It was clear that as a species we had crossed a Rubicon and there was no turning back.

In the years and decades that followed, many scientists, with Oppenheimer a leading voice among them, became activists. In 1955, 10 Nobel laureates signed a manifesto highlighting the dangers of nuclear weapons. Later, a petition signed by 11,000 scientists helped lead to the Partial Test Ban Treaty.

More recent advances have inspired little of the same reverence for the power of technology. Silicon Valley's famously libertarian culture has generally eschewed moral judgments about its inventions. Yet as technology grows in power, its moral dimensions are beginning to have an impact we can no longer ignore.

It’s become increasingly clear, for example, that data bias can vastly distort decisions about everything from whether we are admitted to a school or get a job to whether we go to jail. With the rise of technologies like self-driving cars, decisions such as whether to protect the life of a passenger or a pedestrian will need to be explicitly encoded into the systems we create. Yet we’ve achieved no real clarity about who should be held accountable for the decisions an algorithm makes.

Recent events, from Facebook’s fall from grace to the hacking of social media platforms during the 2016 election to the increasing frequency of data breaches, show how exposed digital technology is to risk. What may be expedient for the current fiscal quarter may carry tremendous financial consequences down the road, once a technology’s impact on society becomes clearer.

Consider CRISPR, the gene-editing technology that is revolutionizing the life sciences and has the potential to cure terrible diseases such as cancer and multiple sclerosis. We have already seen the problems hackers can create with computer viruses; how would we deal with hackers creating new biological viruses?

But skeptics and Luddites are wrong to blame these challenges on technological advancement itself. The problem is us. New technologies always have unanticipated consequences, but the lack of will to create solutions is wholly our own.

We can do better. Consider the case of Paul Berg, a pioneer in the creation of recombinant DNA, for which he won the Nobel Prize. Unlike Zuckerberg, he recognized the gravity of the Pandora’s box he had opened and convened the Asilomar Conference to discuss its dangers, ultimately calling for a moratorium on the riskiest experiments until those dangers were better understood. In her book A Crack in Creation, Jennifer Doudna, who made the pivotal discovery behind CRISPR gene editing, points out that a crucial aspect of the conference was that it invited not only scientists but also lawyers, government officials, and the media. What mattered was the dialogue among a diverse set of stakeholders.

The issues we will have to grapple with over the next few decades will be far more complex and consequential than anything we have faced before. Nuclear technology, while horrifying in its destructive potential, requires a tremendous amount of scientific expertise to produce. Newer technologies are cheaper and far more accessible, and they will require many more people to show good judgment.

History clearly shows that this is possible. Like Oppenheimer, Einstein, and the other nuclear scientists, and like early gene-editing pioneers such as Paul Berg, we can mitigate risks and maximize gains at the same time. Responsible technological advancement doesn’t mean fewer profits. In fact, over the long term, it means more.

Some seem to grasp this. Most of the major tech companies have joined with the ACLU, UNICEF, and other stakeholders to form the Partnership on AI, a forum for developing sensible standards for artificial intelligence. Salesforce recently hired a Chief Ethical and Humane Use Officer. Doudna has begun a similar process for CRISPR at the Innovative Genomics Institute. But these are little more than first steps.

The philosopher Martin Heidegger argued that technological advancement is a process of revealing and building. We can’t control what we reveal through exploration and discovery, but we can—and should—be wise about what we build. If you just “move fast and break things,” don’t be surprised if you break something important.

Greg Satell is an author, speaker, and innovation adviser whose latest book, Cascades: How to Create a Movement that Drives Transformational Change, will be published in April 2019. His previous book, Mapping Innovation, was selected as one of the best business books of 2017.



