The increasingly pervasive use of technology in our everyday lives has triggered a spirited debate on how new and disruptive technologies—such as artificial intelligence (AI), robotics, internet of things (IoT), 5G, blockchain, quantum computing, autonomous vehicles, biotechnology, and nanotechnology—should be managed and governed.
To contribute to the evolving dialogue, BSR has partnered with the World Economic Forum (WEF) Centre for the Fourth Industrial Revolution to publish Responsible Use of Technology and offer a framework for the responsible use of disruptive technologies. This framework is intended to connect the practical steps companies can take to address responsible use issues systematically with the underlying concepts of ethical thinking and international human rights standards.
Over the past few years, a debate has arisen about the relative merits of ethics-based and human rights-based approaches to the responsible use of disruptive technologies. We believe this is a false choice. In the paper, we begin by making the case that responsible use will benefit from both: ethics and human rights should not be thought of as oppositional, but as two synergistic approaches, with a human rights-based approach providing a universal foundation upon which various ethical frameworks, choices, and judgments can be applied. Over the coming months, drawing upon lessons learned from working with BSR’s member companies, we plan to set out in more detail how this combination can be implemented in practice.
Using these approaches, companies can take action to encourage the responsible use of disruptive technologies and mitigate the risk of irresponsible use. However, in the paper, we make the case that this will require the active participation of companies from across the whole technology value chain—not just technology companies, but all industries deploying disruptive technology—as well as governments, civil society, and other stakeholders. The paper lays out the three phases of the technology value chain (design and development, deployment and sale, and use and application) and presents the necessary actions and key questions to address at each phase.
Finally, our paper describes how companies can take action to prevent or mitigate adverse human rights impacts that we have most commonly identified during real-world human rights and ethics due diligence engagements with BSR member companies.
Companies that design, develop, deploy, or sell disruptive technology—the vendors—have at least three main courses of action available to prevent or mitigate adverse human rights impacts.
- They can set and implement limits on what customers can or cannot do with disruptive technologies by establishing “acceptable use policies” covering impacts relevant for that technology, such as privacy, surveillance, or discrimination.
- They can define who they will or won’t sell to by creating whitelists (approved customers) or blacklists (blocked customers).
- They can reduce the likelihood of product misuse by sharing guidance, training, and best practices with others.
Companies that buy, use, and apply disruptive technology—the customers—also have at least three main courses of action available to prevent or mitigate adverse human rights impacts.
- They can make proactive attempts to understand the real impacts arising from their use of disruptive technology by undertaking human rights impact assessments for their own use cases.
- They can make judgment calls and choices about how the disruptive technology will be used to avoid, prevent, or mitigate impacts by acting upon the findings of the assessments.
- They can be deliberate in communicating their lessons learned about product misuse and abuse back to the vendor.
There are also courses of action to prevent or mitigate adverse human rights impacts that exist across the entire value chain.
- Companies can engage in proactive transparency to increase collective awareness of how a technology works with the aim of informing better decisions by others, such as users, governments, and partners.
- Companies can advocate for standards, policies, laws, and regulations from governments at all levels that define how technology should or should not be used.
- Companies can engage with a diverse range of stakeholders and deploy strategic foresight and futures methodologies (such as scenario planning) to anticipate adverse impacts that might otherwise go unnoticed.
It is worth noting that all of these suggested measures have shortcomings. A vendor may establish acceptable use policies—on data privacy, for example—but lack the insight necessary to enforce them effectively. There may be circumstances where society doesn’t want companies deciding who they do and don’t sell products to, such as with communications infrastructure that enables freedom of expression. And there may be situations where transparency heightens the risks faced by vulnerable groups. These shortcomings emphasize the importance of taking system-wide approaches to the responsible use of disruptive technology rather than relying on the actions of a single company or government alone to effect change.
WEF plans to use this paper and the dialogue it creates to inform the creation of implementation tools for organizations to advance responsible technology practices. This might include a responsible use decision framework, resources to improve the integration of both ethical and human rights-based approaches across roles and business functions, and playbooks for each stage of the technology product lifecycle. BSR will be an active participant in this work, and we welcome expressions of interest from BSR member companies seeking to learn more.