Tools and Weapons: The Promise and the Peril of the Digital Age

Date Reviewed
March 10th 2020


This book urges readers to beware of, but not despair about, the threats, dangers and complexities that accompany the rewards and expanding opportunities of pervasive data.


The author then goes through a history of reasons to despair, followed by suggestions for how the dangers may be mitigated.


Although two authors are credited for the book, it is written entirely in the voice of Smith, a lawyer by training and the president of Microsoft.


It is not surprising, then, that much of the material he addresses is legal in nature, concerning laws and regulations.


While I recognize the importance of this, my eyes tended to glaze over often as I tried to follow along.


In his foreword to the book, Microsoft founder Bill Gates credits Smith with stepping outside a purely Microsoft-oriented approach to deal with bigger global concerns. And that does come out in the book.


He got my attention by starting with a physical description of the huge server facilities located around the world that make up what is called the “cloud”, where progressively more of the world's data is stored.


While I admit he didn't always enthral me, he did so often enough that I wasn't tempted to give up on the book.


And unlike some U.S.-based writers aiming at a predominantly U.S. market, he does discuss what is going on in other parts of the world. Much of this is informed by his regular participation in international conferences dealing with the issues he takes in hand. It seems that global data companies are less provincial and more borderless than either the governments or the populations of their home countries would like.


The book is divided largely into 20-page chapters. While it may be only a technical point, I find this makes dense non-fiction material easier to get through.


Smith's introduction launches by referring to the role ancient libraries played in the storage of “data”. Now data is so pervasive that it continues to grow regardless of the vitality of the economy. As an example, he says, this decade will end with 25 times as much data as it began with. He calls data a “renewable resource” that humans create.


He is not very critical of this, unlike Shoshana Zuboff, who in her book “The Age of Surveillance Capitalism” warns of the exploitative, manipulative nature of the harvesting and use of data on the people supplying it.


Back to the data centres, which Smith describes as the world's biggest consumers of electricity. As well as drawing a huge flow of electricity, the facilities are backed up by huge generators for emergencies and batteries to modulate the flow to the servers that make up the cloud. Much of the heat generated by the servers is used to heat the buildings in cold periods. He describes an awesome concentration of hardware, fortified with physical and digital security down to locked storage containers.


He says the data is encrypted to restrict access. Other security measures are described, and data centres back each other up to keep the cloud functioning during localized crises. As an example, Microsoft has about 100 data centres in 20 countries supporting more than a billion customers. He describes them as being at the centre of the new digital age. True to the title, he calls them “a powerful tool and a formidable weapon”.


The placement of data centres to increase proximity to users was needed to avoid even the half-second delay that distance might add, he says. Further, as a global system, even more attention had to be paid to different cultures and standards of privacy.


There is a concentration of data centres in Ireland to serve Europe. He says that, given human rights issues, where a data centre will be placed is a careful decision.


As such, it is a focus of both tension and anxiety, most pronounced in the democracies. And herein lies the major thesis of the book.


To maintain a broad social and economic consensus, both self-regulation and government regulation are needed. He says that governments are reluctant to move, but they must move faster and catch up with the pace of technology.


He points out the incongruity whereby governments serve people within geographical constraints while tech has gone global, making regulation more complicated.


While Smith doesn't get immersed in party politics, he found President Barack Obama's grasp of constitutional law beyond his own, but not so much that there couldn't be a conversation. There is little mention of conversing with the current (Trump) White House, and one can imagine certain frustrations. “We also faced the added complexity of the tech sector's complicated relationship with the Trump White House.” Some of this involved immigration battles. This shouldn't be a surprise given that tech companies are premised on education, training and expertise, and much of this comes from immigrants.


On litigation, he relays that a tech company should fight the cases it deserves to win and settle the ones it deserves to lose. But then you need somebody who can tell the difference. A caveat, he mentions: it is more fun to fight a battle but “typically more rewarding to strike a deal”.


The development of the cloud makes expedient law enforcement, the protection of privacy and human rights, and countries' borders more important.


Smith points out that old versions of programs, with security that has not kept pace with technology, are more vulnerable to hacking. Compounding the danger, most people cannot see that their digital equipment is dangerously outdated, as they might with more obvious physical technology.


At the same time, people may resist upgrades as a marketing ploy to sell more and make their technology more rapidly obsolete.


As such, says Smith, companies are trying to create a financial incentive to upgrade. For emergencies, a company can sometimes offer 'patches' for a specific issue.


He speaks generally but ominously about the danger of cyberweapons and what services could be impaired, if not shut down, suddenly. The ubiquity of computer-controlled services makes much of modern life vulnerable.


This is leading governments to create their own weapons in a cyberweapons race. Concern about this shows a generational divide, with the younger more worried than the older.


At the same time, governments are moving slowly to try to regulate in a way that might lessen vulnerability, he adds.


He described “weaponized email” attacks by the Russians on Hillary Clinton's 2016 campaign and the Democratic party in general. He points to a similar attack in 2017 on France.


He said that because it involved the legitimacy of the 2016 presidential election all bipartisan attempts to solve the issue “went out the window”. In a sense, U.S. action was paralyzed.


While that incident directly attacked Democrats, subsequent hacks, he says, have also targeted Republicans.


And one of the ironies of digital technology is that while it has “made the world smaller” and made people at a distance more accessible, it “has cast a deafening silence between people sitting next to each other.” This is a century-long trend accompanying technology that has connected people who live apart.


In some cases, it has made people more vulnerable to disinformation campaigns, he adds. The Russians, he says, have been able to stir the American political pot with such campaigns.


And while the data companies have not deliberately facilitated this, they have also not put in features to prevent or recognize it. Among the more serious threats may be false audio and video used to attack and malign people.


While there is broad agreement that regulation is needed, there is not much agreement on what it should be.


Both terrorist and state-sponsored exploitation of social media platforms undermine social stability, says Smith, and hence the need for regulation.


But this is not the first example of such concern: radio in the 1930s was thought to distract children from reading and hence diminish their performance.


Concern then led to regulations requiring that radio stations also broadcast public interest programs and documentaries. Despite commercial protest, these additions became conditions of radio licences. However, finding a comparable equivalent to an “editor” for the internet is a new problem. Whether a human or a robot is speaking is at least a verifiable aspect.


He conceded that the U.S. has used manipulative strategies in other countries that would not be permitted in the U.S. Spreading disinformation and disrupting democracy is the prime concern.


The Danish foreign minister described new data companies as a type of “new nation”.


Some people, Smith says, object to the idea that international companies would protect civilians on a global basis rather than help home governments attack other nations. A broad consensus of more than 500 signatories, including 65 countries and many of the tech companies, was reached in early 2019. However, many in the Trump White House were skeptical about multilateral agreements, and the U.S. did not join.


Smith suggests that the nuclear arms negotiations of an earlier generation could be an inspiration for the work needed. He adds that constraining the use of technology is more plausible than trying to ban it. At the same time, governments are tempted by the use of cyberweapons, where detection is difficult.


But data companies, as “nations”, see opportunities to forge their own international agreements. The energy and ambitions tied up in these companies may complicate getting agreements. In addition, competitors in the tech field may be reluctant to publicly support a company going through its own legal difficulties.


But by early 2019, there was some consensus for a cybersecurity tech agreement. However, governments weren't ready to come on board. Even before Trump, the U.S. was lagging on international co-operation over data security regulations.


Personal information may move from data centre to data centre depending on needs and doesn't necessarily stay in one country. And the economy depends on data moving between countries.


Europe, he says, is farthest ahead in regulation related to privacy and it may partly be because Europe is most concerned with privacy.


Smith calls the harvesting of personal data from Facebook customers by the political consulting firm Cambridge Analytica the privacy equivalent of 'Three Mile Island'. The harvesting was done to build a database for advertisements to support Donald Trump's presidential campaign.


And “commercial surveillance” of our online searches, communication, digital location, purchases, and social media tells more about us than we probably want to share. Here he is hinting in the direction of Zuboff's primary concern.


Further, he says, he doesn't believe that privacy will die the quiet death that some in the tech sector predicted 10 or more years ago. He foresees that the privacy issue in the U.S. will move in the direction of Europe, and he expects it will even affect China.


He calls broadband “the electricity of the 21st century” in terms of its importance, illustrating this by suggesting that medicine may soon mean telemedicine. Companies also choose locations partly on access to broadband. Extending broadband is a social cause, given the services tied to it.


And a general tip: in negotiations, never let it come down to one issue, where a winner and a loser are guaranteed. Keep other issues open so that wishes can still be traded.


Smith discusses issues around talent. Like the unequal spread of broadband service, there is an unequal distribution of the opportunity to develop talent. He adds that large technology firms that attract and pay talented people well at the same time cause social dislocation. The high wages of these people compete for the available housing, often putting it out of reach of people who would normally live in the community. San Francisco is an example of these issues.


It is increasingly apparent, he says, that AI technologies need to be guided by strong ethical principles to better serve society.


Vision and speech recognition are two factors central to AI progress. Computers, he says, are now approaching 100 per cent accuracy, where humans are at about 96 per cent. You wouldn't know it from some phone answering systems.


He points to three technological advances that will boost AI rapidly: computing power to perform the massive numbers of calculations; cloud computing, which provides large amounts of power and storage capacity without huge investments in hardware; and massive data sets available to train AI-based systems.


With these having reached a certain level, now AI can begin to move into cognition, the next frontier.


He returns to emphasizing how important diversity among researchers and engineers in the field is, so that technology can reflect the diversity of needs and values in the world. This will become more apparent as computers begin to make decisions previously reserved for humans.


One of the big questions is to whom computers, and their designers, are accountable, he adds.


He indicates that in a few cases employees of tech companies (Google and Microsoft) have refused to work on military contracts that the companies committed to.


So far, he says, military powers have resisted international rules limiting cyberweapons. Hashing out property laws around AI may prove easy compared to laws motivated by ethics, surrounding accountability, public transparency, individual privacy and fairness.


Related to this, on the education front, Smith believes all computer and data scientists should be required to get exposure to the liberal arts, and liberal arts specialists exposure to computer and data science.


As AI becomes better at seeing (facial recognition) and hearing, privacy issues will come to the fore, perhaps as simply as facial recognition of customers going into stores, he adds.


He tells a story of a quandary in which a country without an independent judiciary and with a poor record on human rights wanted to buy facial recognition software from Microsoft and deploy cameras across its capital city. The company declined. It also refused a California police force's request on the grounds that the recognition was too prone to errors. He says facial recognition exemplifies the ethical challenges facing AI.


However, they were concerned that social responsibility would compete with market success as others sold half-baked systems, which would be made better through usage and more data.


Kai-Fu Lee, in his book AI Superpowers, says that American companies tried to perfect technology before releasing it, whereas Chinese companies released systems that had basic functionality and tried to improve them during usage.


Throughout this book, Smith describes the evolution at Microsoft, which in the 1990s battled regulation as unnecessary but is now advocating it.


But he says once a company has reliable answers to the critical questions it should build and release the product and get real-world feedback. This approach has helped technology move ahead faster.


He doesn't miss pointing out the almost certain and imminent loss of jobs, maybe even industries, to AI. Some may be replaced as soon as it is technologically possible; others may be delayed for political and social reasons. Advanced degrees and sophisticated skills are no guarantee of protection.


Progressively more work time is devoted to meetings and other forms of communication, and this collaboration will be an area affected by AI later. AI may not be good at jobs focussing on empathy. And then there will be new jobs, some perhaps already imagined, others not.


Smith says engineers are doing a fairly good job of predicting where AI will go, but are much less successful in predicting time frames for these changes, often overestimating change over the next two or three years and underestimating it over a decade. He refers to a “ripening period” for technology to explain the gap. During that period there may be a confluence of reinforcing technologies. He illustrates this with the jump made by the iPhone from the myriad of cell phones used in the decade earlier.


He speculates that the self-driving car may be waiting for such a confluence of technologies.


He gives the 'jump' from horse to car as an example of a change that occurred over a decade, 1910 to 1920. He also illuminates how deeply horse culture was embedded in society, with much farm effort devoted to growing food for horses. That shift was a contributing factor in falling farm revenue and the Great Depression.


Smith speculates that the transition to AI may be comparably disruptive economically and socially (cultural values and societal choices). And although fearful, he doesn't rule out positive surprises in the transition. One such change from the shift to cars was the expansion of consumer credit and advertising, although it could be argued that these are two-edged swords.


Smith talks about the way China has used its technology sector to control its market for its own businesses. Apple, with its iPhone, is the only American tech company to thrive in China. And given the potential for technology transfer, the U.S. is not sure it wants success there. In addition, American companies may not be well tuned to Chinese tech needs and wants.


Also, Chinese tech products are more competitive with American ones than those from other countries. And then there is the ultimate dichotomy between China's emphasis on public order and the U.S. emphasis on human rights.


A further difference, he indicates, is that the West, in this case Silicon Valley, is motivated by the attitude of pouring effort into something to advance it. He contrasts this with the Chinese belief that things move in circles and return to starting points. As such, the Chinese have a broader perspective, he adds.


The data of most concern kept in the cloud, he says, includes facial recognition and consumer data.


While some might like to keep knowledge separate, new discoveries of basic ideas often come from academic fields, and academicians race to post their findings on open access sites.


He points out that if both China and the U.S. cast aspersions on each other's technology other countries may believe both and look for other sources. As a result both might find it prudent to keep most technology open. At present, the world is almost divided in half with respect to technology.


In what might be another political chastisement, Smith says “if the United States is going to navigate global challenges, it will need leaders who understand the world.”


He refers to Kai-Fu Lee's AI Superpowers, where data is described as the “rocket fuel” for AI, a contention similar to Zuboff's. The three do not share the same concerns about the dangers.


Lee predicts that it will lead to greater concentration of power and ultimately wealth in every sector of the economy. Zuboff certainly has this worry.


Data, says Smith, leads the triumvirate of data, cloud-based computing power and algorithms as the key element going forward.


Smith's answer is to democratize AI and data. Data is like a renewable resource, continually increasing. Finite products are gone when used; data can be reused multiple times for different purposes. The key is that it remain free for use. Hoarding data increases the need to collect it again, yet hoarding is encouraged by profit motives.


He speculates about “open data” the way there was “open-source code” for software.


He speaks to the issue of sharing data without giving up ownership of it. Broader use of data will also be demanded for machines and not just humans. On this matter Smith seems to be saying everything right and asking the right questions, but ultimately decisions can be about money, and that is what worries Zuboff.


Smith says that AI is unlike singular inventions of the past (telephone, car, computer) and is more like electricity, which powers tools and devices integral to current society. Such pervasiveness could lead to a conundrum: as electronics go from 20 per cent of a car's value in 2000 to an estimated 50 per cent by 2030, the question could be whether car makers or the tech companies will be the chief manufacturers.


He doesn't advocate that technology be blunted, but that more effort be put into managing it. Unlike with technologies of the past, regulations need to be brought forward to control AI as it evolves. He puts much of this proactive responsibility into the hands of the technology industry, and with that a “more principled approach by the industry is called for”.


More than any other technology before it, AI's reach is global and beyond any one government, and connectivity is its strength. Since it is beyond any single government's control, it demands intergovernmental co-operation. Unfortunately, this need comes when governments are turning more inward. Ironically, the “inexorable course of technology is forcing more international collaboration.”


But there is little global unanimity about privacy, free expression and human rights. He suggests that democracies, in a lull of vigour, need new collaboration in the managing of technology and its impact.


Smith advises that the tech companies focus on what kind of regulation would be sensible rather than trying to prevent it. His concern is that the world will not do enough and governments will move too slowly.