ESG, Blockchain, and AI – Oh My!

Highlights

AI and blockchain are gaining traction as innovative solutions for improving ESG performance and reporting

Together, they are being used to identify and mitigate ESG risks, improve efficiency, and enhance stakeholder engagement

Increased scrutiny of AI and blockchain is likely to push Congress to regulate these technologies 

To effectively incorporate AI and blockchain into ESG practices, companies will need to monitor risks and adapt quickly to the dynamic regulatory environment

As the importance of environmental, social, and governance (ESG) considerations grows, companies are seeking innovative solutions to improve their ESG performance and reporting. Two emerging technologies that are gaining traction are artificial intelligence (AI) and blockchain. AI can help companies analyze large amounts of ESG data to identify trends and opportunities, while blockchain can increase transparency and traceability in ESG reporting. 

Together, they are being used to identify and mitigate ESG risks, improve operational efficiency, and enhance stakeholder engagement. However, as with any new technology, there are also risks and challenges associated with their use. 

In recent months, both AI and blockchain technologies have been subject to increased scrutiny, largely driven by the explosion of generative AI, such as OpenAI's ChatGPT, which reached 100 million active users within a month of its launch in November 2022, and the high-profile implosion of blockchain-based cryptocurrencies. Because of these developments, as well as regulatory measures already taken at the state level and by other countries, it is widely anticipated that pressure on Congress to formally regulate AI and impose restrictions on blockchain technology will increase substantially. 

The iconic quote by Heraclitus, "change is the only constant in life," is particularly apt for the dynamic regulatory environment surrounding AI and blockchain. To incorporate use of these technologies into their ESG practices effectively, companies must be nimble, stay up-to-date with evolving regulations, and adapt quickly to this rapidly changing landscape.

Artificial Intelligence and ESG

Potential AI Benefits

AI can help companies automate the collection and analysis of ESG data from various sources, including social media, news outlets, and financial filings. This can help companies identify material ESG risks and opportunities, track performance, and provide insights to inform decision-making. Some examples of potential benefits from the use of AI include the analysis of:

  • Unstructured data to identify patterns and trends related to environmental risks, social impacts, and governance issues, which can help companies better understand and mitigate those risks
  • Data on energy usage and emissions to help companies identify areas for improvement and development of more effective sustainability strategies
  • Satellite imagery and other environmental data to monitor a company's impact on the environment, such as tracking deforestation or water usage. This data can then be used to identify ways to reduce the company's overall environmental footprint.
  • Supplier data, including social and environmental impact, to identify ESG risks related to supply chain management, such as labor and human rights violations, fraud or corruption, or environmental non-compliance, and to inform action to mitigate those risks
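
For illustration only, the minimal Python sketch below shows one way unstructured text, such as news headlines, might be screened for ESG risk signals and aggregated into trend counts. The keyword lists and sample headlines are assumptions made for this example; a production system would rely on trained language models and a far richer taxonomy.

```python
# Minimal sketch: flagging potential ESG risk signals in unstructured text.
# The keyword lists and sample headlines below are illustrative assumptions,
# not a production taxonomy.
from collections import Counter

ESG_KEYWORDS = {
    "environmental": ["emissions", "deforestation", "spill", "water usage"],
    "social": ["labor violation", "human rights", "safety incident"],
    "governance": ["fraud", "corruption", "bribery", "insider trading"],
}

def tag_esg_risks(text: str) -> list[str]:
    """Return the ESG categories whose keywords appear in the text."""
    lowered = text.lower()
    return [
        category
        for category, keywords in ESG_KEYWORDS.items()
        if any(keyword in lowered for keyword in keywords)
    ]

headlines = [
    "Supplier cited for labor violation at overseas plant",
    "Company reports 12% reduction in emissions year over year",
    "Regulator opens bribery probe into regional subsidiary",
]

# Aggregate the flags to see where risk mentions are trending.
trend = Counter(category for h in headlines for category in tag_esg_risks(h))
print(trend)  # e.g. Counter({'social': 1, 'environmental': 1, 'governance': 1})
```

Even this simple pattern reflects the workflow that AI-based tools automate at scale: classify incoming data, then aggregate it to surface where risks are trending.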

Additionally, AI can be used to improve ESG reporting and disclosure. AI can help companies collect and analyze ESG data in a more standardized and consistent way, affording stakeholders access to timely, accurate and more reliable information that can improve transparency and accountability.

Potential AI Risks

While the use of AI algorithms to identify patterns and trends related to environmental risks, social impacts, and governance issues can be invaluable, there is concern about the potential for bias in AI algorithms themselves. This can be particularly problematic in the area of social issues, where algorithms may not be able to fully capture the complexity of human experiences and social dynamics. There are several potential sources of bias in AI algorithms:

  • Data selection bias, where the data used to train the algorithm is not representative of the population being assessed. For example, if an algorithm is trained on data from a particular region or industry, it may not be able to accurately assess companies operating in different regions or industries.
  • Algorithmic bias, where the algorithm itself is designed or programmed in a way that is biased. If the algorithm is designed with implicit or explicit biases, such as assumptions about certain ESG risks or how to weigh them, those biases will be reflected in the algorithm's assessments. For example, an algorithm designed with a bias towards financial metrics over social or environmental metrics may systematically undervalue social and environmental risks in its assessments.
  • Human bias, where the people developing and training the algorithm introduce their own biases into the data or the algorithm itself. For example, if the people training the algorithm are not diverse, they may not be able to account for certain biases or experiences that are relevant to ESG issues.
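
To make the data selection concern concrete, the hypothetical Python sketch below compares a scoring model's average output across regional subgroups and flags large gaps for human review. The records and the disparity threshold are invented for demonstration and are not drawn from any audit standard.

```python
# Minimal sketch: a subgroup audit for possible data selection bias in an ESG
# scoring model. The records and the 0.15 disparity threshold are illustrative
# assumptions; a real audit would use the company's own data and criteria.
from statistics import mean

scored_records = [
    {"region": "north_america", "model_score": 0.82},
    {"region": "north_america", "model_score": 0.78},
    {"region": "southeast_asia", "model_score": 0.55},
    {"region": "southeast_asia", "model_score": 0.61},
]

def subgroup_averages(records, group_key="region"):
    """Average model score per subgroup, so the groups can be compared."""
    groups = {}
    for record in records:
        groups.setdefault(record[group_key], []).append(record["model_score"])
    return {group: mean(scores) for group, scores in groups.items()}

averages = subgroup_averages(scored_records)
gap = max(averages.values()) - min(averages.values())
if gap > 0.15:  # assumed tolerance, for demonstration only
    print(f"Potential subgroup disparity detected: {averages} (gap={gap:.2f})")
```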

The potential risks associated with bias in AI algorithms for ESG are significant. Biased algorithms can lead to inaccurate assessments of companies' ESG performance, which can result in misallocations of capital or misguided investment decisions. Biased algorithms can also perpetuate or exacerbate existing social, environmental, and governance inequities. Several steps can be taken to address and mitigate these risks:

  • Companies can ensure the data used to train the algorithms is comprehensive, representative, and free from bias. This may require incorporating multiple data sources and engaging with diverse stakeholders to identify relevant ESG risks and opportunities.
  • Companies can conduct regular audits of the algorithms to detect and address any bias in their design or programming. This may involve engaging with third-party experts to assess the algorithms' methodologies and identify areas for improvement.
  • Companies can ensure the algorithms are transparent and explainable, so that stakeholders can understand how the algorithms arrive at their assessments. This may involve providing clear documentation of the algorithms' design, programming, and training data, as well as establishing mechanisms for stakeholder feedback and input (a simple illustration of this kind of explainability follows this list).
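
As a hypothetical sketch of what explainability can look like in practice, the Python example below reports each factor's contribution to a simple weighted ESG score alongside the total, so a stakeholder can see what drove the result. The factor names and weights are invented for illustration and do not reflect any recognized scoring methodology.

```python
# Minimal sketch: making a simple weighted ESG score explainable by reporting
# each factor's contribution alongside the total. The weights and inputs are
# illustrative assumptions, not a recognized ESG methodology.

WEIGHTS = {"emissions_intensity": -0.4, "board_independence": 0.35, "injury_rate": -0.25}

def explain_score(factors: dict[str, float]) -> dict:
    """Return the total score plus the per-factor contributions behind it."""
    contributions = {name: WEIGHTS[name] * value for name, value in factors.items()}
    return {"score": sum(contributions.values()), "contributions": contributions}

report = explain_score(
    {"emissions_intensity": 0.7, "board_independence": 0.9, "injury_rate": 0.2}
)
print(report)  # stakeholders can see which factors drove the assessment
```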

A further very real risk is the possibility of over-reliance on AI systems for decision-making, which may lead to a lack of human oversight and understanding of the limitations and biases of the system. It is crucial for companies to use AI as a tool to support human decision-making, rather than replace it. In the long term, a balanced approach to AI integration in ESG practices, combining the strengths of AI with the judgment of human experts, is necessary to realize the full potential of these technologies in driving sustainable and responsible business practices.

Blockchain and ESG

Potential Blockchain Benefits 

Blockchain technology has the potential to improve transparency and accountability in ESG reporting by providing a secure, immutable record of data and transactions related to ESG factors, such as carbon emissions, social impact, and governance practices. Benefits that can be derived from using blockchain technology include:

  • Blockchain can help address the issue of data accuracy and reliability by automating the process of ESG data aggregation and reconciliation, reducing the need for manual data entry and processing, which in turn should minimize the risk of errors and inconsistencies.
  • Blockchain technology can facilitate stakeholder engagement and collaboration by providing a platform for multiple parties, including companies, investors, regulators, and non-governmental organizations, to access the same data and verify its accuracy in real time. For example, blockchain offers a platform for accurate and reliable management of carbon trading and monitoring of carbon offsets worldwide, establishing a trustworthy record of transactions and giving all participants real-time access to the same verified information about their credits and progress toward greenhouse gas emission reduction targets.
  • Blockchain can improve accountability in ESG reporting by making it more difficult to manipulate data. Once a record has been added to the blockchain, it cannot be altered or deleted without the agreement of all parties involved. This makes it more difficult for companies to misrepresent their ESG performance or engage in greenwashing, as their claims can be independently verified through blockchain data (see the sketch following this list).
  • Using blockchain for ESG reporting can increase transparency and traceability in supply chains. A company can use blockchain to track the origin and supply chain of raw materials to ensure they are ethically and sustainably sourced and have a low environmental impact. By using a distributed ledger, stakeholders can track the progress of and changes to the materials as they move through the supply chain, ensuring transparency and accountability. This can also help companies proactively identify potential ESG risks and opportunities within their supply chains, and take action to address them.
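
The tamper-evidence property described above can be illustrated with a toy example. The Python sketch below builds a hash-chained ledger of hypothetical supply-chain records; it omits everything that makes a real blockchain a blockchain (consensus, a distributed network, cryptographic signatures) and is meant only to show why altering an earlier record is detectable.

```python
# Minimal sketch: a toy hash-chained ledger illustrating why blockchain records
# are tamper-evident. This is not a real blockchain (no consensus or network);
# the supply-chain entries are illustrative assumptions.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list[dict], record: dict) -> None:
    previous = block_hash(chain[-1]) if chain else "genesis"
    chain.append({"record": record, "previous_hash": previous})

def is_valid(chain: list[dict]) -> bool:
    """A single altered record breaks every subsequent previous_hash link."""
    return all(
        chain[i]["previous_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger: list[dict] = []
append_block(ledger, {"material": "cobalt", "origin": "certified_mine_A", "co2_kg": 12})
append_block(ledger, {"material": "cobalt", "stage": "refined", "co2_kg": 30})

print(is_valid(ledger))            # True
ledger[0]["record"]["co2_kg"] = 1  # attempt to "greenwash" an earlier entry
print(is_valid(ledger))            # False: the tampering is detectable
```

Because each block stores the hash of the block before it, changing any historical record invalidates every later link, which is what allows ESG claims recorded on a blockchain to be independently verified.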

Potential Blockchain Risks

While the benefits of using blockchain for ESG reporting and management are numerous, there are also potential risks and challenges that need to be considered. Some of the same characteristics that make blockchain appealing can also lead to concerns about privacy:

  • Immutable nature of data. Once data is stored on the blockchain, it cannot be altered or deleted. This can raise concerns about the privacy and security of sensitive information, as it cannot be removed or modified once it is added to the blockchain. This can also be problematic if the data contains errors or inaccuracies, or if the data becomes outdated or irrelevant.
  • Lack of centralized control. Because blockchain operates on a decentralized network, there is no central authority controlling access to the data. While this can increase security and transparency, it can also make it difficult to enforce privacy policies or protect sensitive data.
  • Potential for data breaches. While blockchains are generally considered to be secure, there is still the potential for data breaches or hacking attacks that could compromise the privacy of information stored on the blockchain.

Several steps can be taken to address and mitigate these privacy risks:

  • Use permissioned blockchains: Implementing a permissioned blockchain, where only authorized parties have access to the data, can prevent unauthorized access.
  • Encrypt data: Companies can use encryption methods to protect sensitive information on the blockchain, such as personally identifiable information. Companies can also consider privacy-enhancing technologies, such as zero-knowledge proofs or differential privacy, to protect sensitive information while still allowing stakeholders to verify the authenticity of data on the blockchain (a minimal illustration of this general idea follows this list).
  • Use smart contracts: Smart contracts can be used to automate data privacy and protection, ensuring that data is only shared with authorized parties and only for specific purposes. This can be particularly useful in supply chain management, where data privacy and protection are critical.
  • Implement privacy policies and guidelines: These policies can outline the principles and practices for blockchain data privacy and protection, and ensure that all stakeholders are aware of the company's commitment to data privacy.
  • Conduct regular security audits: Regular security audits can help identify and address potential vulnerabilities in the blockchain system, ensuring that data is secure and protected.
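
One widely discussed pattern for reconciling immutability with data privacy is to keep personal or sensitive data off-chain and record only a salted hash (a commitment) on the ledger. The Python sketch below illustrates the idea with assumed record contents; real deployments would combine it with access controls, encryption of the off-chain store, and, where appropriate, the privacy-enhancing technologies mentioned above.

```python
# Minimal sketch of the "hash on-chain, data off-chain" pattern for reconciling
# blockchain immutability with data privacy. The record contents are
# illustrative assumptions.
import hashlib
import secrets

def commit_record(record: bytes) -> tuple[str, bytes]:
    """Return (on_chain_commitment, salt). Only the commitment goes on-chain."""
    salt = secrets.token_bytes(16)
    commitment = hashlib.sha256(salt + record).hexdigest()
    return commitment, salt

def verify_record(record: bytes, salt: bytes, commitment: str) -> bool:
    """Anyone holding the off-chain record and salt can prove it matches."""
    return hashlib.sha256(salt + record).hexdigest() == commitment

supplier_audit = b"supplier=acme; audit=passed; auditor_id=A-104"
on_chain, salt = commit_record(supplier_audit)

print(verify_record(supplier_audit, salt, on_chain))  # True
# If the off-chain personal data must later be erased, only an opaque hash
# remains on the immutable ledger.
```

Keeping only commitments on the immutable ledger may ease the tension with deletion and correction rights, such as those under the GDPR discussed later in this alert.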

Apart from privacy concerns, there are several other risks associated with using blockchain technology for ESG practices. Because blockchain technology is still relatively new, and its long-term viability and scalability have not yet been fully tested, there is potential for unintended consequences and negative impacts. For example, the automation of certain processes using blockchain technology (and AI) could result in the displacement of workers, leading to social and economic challenges. 

Additionally, the lack of standardization and interoperability in the use of blockchain could lead to fragmentation and inefficiencies in the market, making it difficult for companies to compare and benchmark their ESG performance against industry standards. The use of blockchain in ESG reporting may also require significant investment in technology and infrastructure, which may not be accessible to all companies, potentially creating a digital divide in the market. Blockchain is energy-intensive and its use could exacerbate environmental issues if not properly managed.

Overall, while there are risks and challenges associated with the use of blockchain in ESG reporting, the potential benefits of increased transparency, traceability, and accountability in supply chains, improved data accuracy and reliability, and enhanced stakeholder collaboration and engagement make it a valuable tool for companies seeking to enhance their ESG performance and reporting. As with any emerging technology, it is important to carefully consider and continually monitor the potential risks and take steps to address them to maximize the prospects that use of blockchain technology will improve ESG data management and reporting.

Regulatory Responses to AI and Blockchain

Regulation of AI

Currently, there are no federal laws in the United States that regulate AI or blockchain as they relate to ESG. However, in October 2022, the White House Office of Science and Technology Policy (OSTP) issued a Blueprint for an AI Bill of Rights that lays out five overlapping principles and associated guidance regarding automated, or artificial intelligence, systems that have the potential to meaningfully impact the American public’s rights, opportunities, or access to critical resources or services. These principles address the following topics: Safe and Effective Systems; Algorithmic Discrimination Protections; Data Privacy; Notice and Explanation; Human Alternatives, Consideration, and Fallback. According to the White House, “… these principles are a blueprint for building and deploying automated systems that are aligned with democratic values and protect civil rights, civil liberties, and privacy.”

This blueprint is not an enforceable law, nor is it a binding regulation. While it is voluntary and aspirational at present, it is to be followed by federal agencies in their design, implementation and operation of AI systems. At least two federal agencies separately issued their own AI guidance before the blueprint was published -- the Equal Employment Opportunity Commission (EEOC) in May 2022 and the Federal Trade Commission (FTC) in May 2021. In May 2022, the U.S. Department of Energy established its own AI Advancement Council.

Additionally, on Jan. 26, 2023, the National Institute of Standards and Technology (NIST) released its Artificial Intelligence Risk Management Framework. The goal of this framework “is to offer a resource to organizations designing, developing, deploying, or using AI systems to help manage the many risks of AI and promote trustworthy and responsible development and use of AI systems.” Like the blueprint, the NIST framework is voluntary.

A proposed law introduced in California in early February 2023 would align the state with the White House blueprint, but would also include enforcement mechanisms. Another bill recently introduced in Connecticut would establish an Office of Artificial Intelligence to assure that AI used by state agencies adheres to state and federal privacy and discrimination laws. Other states that have recently added laws or regulations focusing on particular AI issues or practices include Illinois, Maryland, New York, Colorado, and Virginia.

On the international front, the European Union’s (EU) Artificial Intelligence Act (AIA), approved by the Council of the EU on Dec. 6, 2022, and set to be considered by the European Parliament shortly, would regulate nearly all AI applications, products and services based on a tiered series of risks. In July 2022, the UK issued a policy paper, Establishing a pro-innovation approach to regulating AI, that focuses on flexible measures to use AI responsibly while reducing compliance burdens on businesses to boost the economy – a very different approach than the EU AIA. Over the past year, while the EU AIA, the UK policy paper, and the White House blueprint have been percolating, China has been rolling out a series of regulations targeting specific types of algorithms and AI capabilities.

On Jan. 27, 2023, the U.S. and the EU signed an Administrative Arrangement on Artificial Intelligence for the Public Good that provides for collaboration on research to identify and further develop promising AI research results that have “the potential for broad societal benefits in areas ranging from climate change, natural disasters, health and medicine, electric grid optimization to agriculture.” While this agreement does not focus on AI legislation, it may provide the EU and the U.S. a platform to align the White House blueprint and the AIA.

Regulation of Blockchain

It remains to be seen whether and to what extent these legislative and regulatory AI initiatives at the federal level will encompass technical blockchain issues as well. The primary technical overlap is likely to be in the realm of data privacy and security issues. Until the scope of the new AI laws and regulations is sorted out, one of the more significant authorities that applies specifically to the use of blockchain for ESG is the EU’s General Data Protection Regulation (GDPR). That regulation imposes strict data protection rules that require companies to ensure the privacy and security of personal data stored on blockchains. In the context of blockchain use, compliance with these requirements can be challenging, as the immutability of blockchain data may make it difficult to correct or delete personal data.

Application of the GDPR to blockchains is of particular importance now that the EU’s Corporate Sustainability Reporting Directive (CSRD) is effective. Companies planning on collecting and managing ESG data to comply with the CSRD disclosure requirements (including those U.S. companies operating in the EU that are subject to CSRD requirements) must take care to comply with the applicable GDPR data protection regulations to avoid substantial legal penalties. The European Securities and Markets Authority (ESMA) has also issued guidelines for companies using blockchain in the financial sector, which include requirements for data protection and privacy. Additionally, the European Data Protection Board (EDPB) has issued guidelines on the use of blockchain and its interaction with GDPR.

Regulatory bodies in several countries are also imposing rules and issuing guidance for the protection of personal information stored on blockchains, including the Personal Information Protection Commission in Japan; the Personal Data Protection Commission in Singapore; the Information Commissioner's Office in the UK; the authorities administering the Personal Information Protection Act in South Korea; the Office of the Australian Information Commissioner; and the Office of the Privacy Commissioner of Canada.

At the state level in the U.S., the California Consumer Privacy Act (CCPA) is a privacy law that requires companies doing business in the state to provide California consumers with certain rights regarding their personal information. This includes the right to know what personal information is being collected about them, the right to request deletion of their personal information, and the right to opt out of the sale of their personal information. The CCPA applies to personal information stored on blockchains.

Planning for Future Regulation

The evolving regulatory environment surrounding AI and blockchain technology highlights the need for companies to stay ahead of the curve. The recent simultaneous surge in generative AI and collapse of blockchain-based cryptocurrencies have led to increased scrutiny of these technologies by regulators around the world, and there will be growing pressure on Congress to formally regulate these technologies, including their use for ESG purposes.

For companies to successfully incorporate use of AI and blockchain into their ESG practices, they must remain adaptable and responsive to continuing changes in regulations. This requires closely monitoring regulatory developments and implementing strong data protection and privacy measures. By staying informed and taking proactive measures, companies can leverage the benefits of these technologies to enhance their ESG performance and reporting while mitigating potential risks and maintaining regulatory compliance.

For more information, please contact the Barnes & Thornburg attorney with whom you work or Bruce White at 312-214-4584 or bwhite@btlaw.com

© 2023 Barnes & Thornburg LLP. All Rights Reserved. This page, and all information on it, is proprietary and the property of Barnes & Thornburg LLP. It may not be reproduced, in any form, without the express written consent of Barnes & Thornburg LLP.

This Barnes & Thornburg LLP publication should not be construed as legal advice or legal opinion on any specific facts or circumstances. The contents are intended for general informational purposes only, and you are urged to consult your own lawyer on any specific legal questions you may have concerning your situation.
