The UN Human Rights Council recently initiated an expert consultation on the practical application of the Guiding Principles on Business and Human Rights (UNGPs) to the activities of technology companies, and it sought formal input from stakeholders.
This consultation is very timely given the increasing relevance of technology to the realization of human rights in practice and the prominent role of the private sector in how technology is designed, developed, and deployed.
Over the past two decades, BSR has gained significant experience working with technology companies on human rights due diligence, including around 100 human rights assessments.
We have also led collaborative efforts (such as facilitating the creation of the Global Network Initiative, Responsible Business Alliance, and Technology Against Trafficking) and published research (such as Human Rights-Based Approaches to Content Governance in the Social Media Industry).
We’ve drawn upon this experience to make a submission to the consultation, which is available in full here. We hope the submission provides a thoughtful contribution to the application of the UNGPs to the activities of technology companies. The key points are summarized below:
Human Rights Risks in Business Models
The need to understand the human rights impacts arising from company business models has rightly received growing attention in recent years. However, human rights due diligence practitioners can further define what due diligence of business models means in practice, and the venture capital industry (an essential gatekeeper determining which technologies make it to market) requires more attention.
Human Rights Due Diligence and End-Use
BSR’s engagements with companies on human rights due diligence have taken a variety of forms, spanning geography (e.g., market entry, exit, ongoing presence), products (e.g., entire platforms, new features, product research), customers (e.g., industry verticals, specific customers, use cases), and governance (e.g., product policies, mergers and acquisitions).
This diversity leads us to caution against a “one-size-fits-all” approach and instead to appreciate the value of tailoring approaches to secure maximum traction across a wide range of functions—such as engineering, product management, policy, and sales.
- Users play a significant role in shaping impact, and a considerable challenge when assessing human rights impacts is the interplay between how the technology company designs a product and how that product is applied in real life. For more, see BSR’s report on downstream human rights due diligence.
- While today’s human rights assessments are typically undertaken for a single company, solutions are often more effective at the system or sector level. All industries deploying technology are relevant—not just technology companies—and businesses in other industries should be undertaking human rights due diligence around how they deploy technology.
- The quality of human rights due diligence improves significantly when it draws upon insights from a range of professional communities—including business and human rights teams, product managers, research and design teams, and sales and marketing teams—using a “human rights by design” approach.
- More engagement is required with non-users because technology can impact rightsholders who do not use the product in question. For example, hate speech on social media can be associated with real-world harm.
Accountability and Remedy
We believe that companies should be more transparent about the results of human rights due diligence and envision an ideal where companies publish insights as part of their overall “sustainability” disclosures. The recent integration of the UNGPs into the Global Reporting Initiative Universal Standards and the draft EU sustainability reporting standard are encouraging in this regard.

Furthermore, we believe that access to remedy in business-to-business (B2B) and business-to-government (B2G) contexts needs further exploration. When undertaking human rights assessments in B2B and B2G settings (e.g., a cloud services company providing AI products to financial services companies), we’ve explored whether the AI vendor should “require” the buyer to set up reporting channels, whether the AI vendor should have its own mechanism, and how the responsibility to provide remedy should be distributed across a complex web of vendors, systems integrators, and customers.
The State's Duty to Protect
Over recent years, governments have increasingly proposed and implemented regulations that are relevant for human rights in the technology industry.
We believe that government regulations of relevance to human rights due diligence—such as the General Data Protection Regulation, Digital Services Act, and AI Act—should be consistent and interoperable with the UNGPs. For example, we recommend reinforcing the message that all human rights are potentially relevant for technology companies and that human rights due diligence is essential, rather than pre-determining certain technologies as inherently high or low risk.
We are also concerned about the growth of regulatory proposals from governments that would bring adverse human rights impacts, such as efforts to establish liability for “lawful but awful” content (which would result in overbroad restrictions on freedom of expression), attacks on the use of end-to-end encryption (which is essential to protect rightsholders, especially human rights defenders, children, and other vulnerable users), and data localization laws (which can limit cross-border communication and present severe privacy risks).
Writing the BSR submission presented a timely opportunity for reflection, and we hope it provides a useful analysis of both the current state and future direction of human rights due diligence in the technology industry.