Welcome to our first edition of Regulatory Roundup—a special blog series highlighting some of the latest and most significant regulatory news and insights in the AML compliance space. We’ll delve into the major themes that have emerged recently, from compliance culture to cryptocurrency, from new sanctions and enforcement efforts to artificial intelligence (AI) regulations.
The Financial Crimes Enforcement Network (FinCEN) issued a final rule regarding access to beneficial ownership information (BOI). This rule implements provisions from the Corporate Transparency Act (CTA) that allow certain entities to access identifying information associated with reporting companies, their beneficial owners, and company applicants.
The Council and Parliament of the European Union have agreed to establish a new authority called the Anti-Money Laundering Authority (AMLA). AMLA will have direct and indirect supervisory powers over high-risk entities within the financial sector.
The European Banking Authority (EBA) has launched a public consultation on new guidelines to prevent the abuse of funds and certain crypto-asset transfers for money laundering and terrorist financing purposes. These guidelines, which implement the so-called “travel rule,” specify the steps various financial service providers should take to detect missing or incomplete information in funds or crypto-asset transfers.
The US and the UK have taken coordinated action against Hamas leaders and financiers. The move includes imposing sanctions on notable Hamas officials and the mechanisms by which Iran supports Hamas and Palestinian Islamic Jihad (PIJ). These actions are designed to protect the international financial system from misuse by Hamas and its enablers.
The UK government provided guidance on sanctions ownership and control. The document specifies circumstances under which an entity is considered owned or controlled by another person.
Members of the European Parliament (MEPs) are urging the EU and its member states to strengthen and centralize oversight of how sanctions are implemented. They also call for improved coordination in enforcing existing sanctions on Russian oil exports and imposing sanctions on major Russian oil companies and their subsidiaries.
Executive Order 14114, issued on December 22, 2023, outlines additional measures the US took in response to the Russian Federation’s harmful activities—underscoring the US government’s commitment to addressing concerns related to Russia and its military-industrial complex. Under this order, foreign financial institutions engaging in significant transactions or providing services related to Russia’s military-industrial base are at risk of sanctions from the Office of Foreign Assets Control (OFAC).
OFAC has reached a $466,200 settlement agreement with property and casualty insurer PURE for apparent violations of the Ukraine/Russia-related sanctions regulations. The settlement relates to four insurance policies, two under an individual’s name and two under a company owned by the same individual, all of which were initially onboarded without any sanctions screening.
The UK Information Commissioner has warned that 2024 may be the year when people lose trust in AI if privacy concerns are not adequately addressed. The commissioner emphasized the importance of integrating privacy measures into AI products and services right from the beginning.
Canadian privacy regulators have introduced a set of principles aimed at promoting the responsible development and use of generative AI. These principles cover various aspects of AI use: establishing legal authority and consent, ensuring that AI applications serve appropriate purposes, adhering to necessity and proportionality, maintaining openness and accountability, providing individual access, limiting the collection, use, and disclosure of data, ensuring accuracy, and implementing safeguards.
The EU has agreed to implement the Artificial Intelligence Act, establishing comprehensive regulations for trustworthy AI systems. The Act encompasses various vital components, including a list of banned AI applications, exemptions for law enforcement purposes, obligations for high-risk AI systems, guardrails for general AI systems, measures to foster innovation and support small and medium-sized enterprises (SMEs), as well as provisions for sanctions and the Act’s entry into force.
In response to an executive order, the National Institute of Standards and Technology (NIST) has outlined a series of initiatives aimed at managing risks associated with artificial intelligence (AI).
The Bank of England, including the Prudential Regulation Authority (PRA), and the Financial Conduct Authority (FCA) have received feedback suggesting that a regulatory definition of AI might not be helpful given the rapid evolution of AI capabilities. Instead, regulators are encouraged to maintain “live” regulatory guidance.
The European Parliament and the Council of the European Union are currently deliberating the European Commission’s proposed Artificial Intelligence Act, which aims to ensure the proper functioning of the EU marketplace by creating conditions for developing and using trustworthy AI systems.
We’ll continue to monitor regulatory developments and provide updates in future editions of Regulatory Roundup. In the meantime, reach out to us if you need help navigating the evolving landscape of regulations and enforcement actions.
FinScan offers Advisory Services to assist with model risk management, data governance, policies and procedures, and assessing your organization’s data quality, sanctions compliance program, customer and compliance risk, and AI framework, to name a few. Contact us to learn more.