AI among the most alarming risks for 2024

On January 10, the World Economic Forum (“WEF”) released its “Global Risks Report 2024”, in which it assesses that the use of artificial intelligence and its outputs, especially with regard to misinformation and disinformation, will be among the biggest risks this year, along with cyberattacks, climate problems, political polarization and the housing price crisis.

According to the organization, the rapid advancement of AI and related technologies poses a series of challenges while also opening the way to innovative solutions. The WEF also points out that “the risk of market concentration and its potential impact on national security incentives require a concerted effort to improve global governance structures”.

The WEF also stresses the need for a global approach that regulates and defines harmonized parameters for the development and use of this type of technology, covering both civil and military applications.

Echoing the concern expressed by the WEF, a regulatory movement can be identified in different countries to respond to the risks associated with the use of AI at different levels. Some countries, such as China and Canada, have already approved regulations establishing limits and criteria for the protection of human rights, while negotiations to approve rules of this nature are under way in Europe (the European AI Act), the United States, India and Brazil.

In addition, risk assessment frameworks and specific guidelines on the use of AI have been developed in many countries, such as the United Kingdom (which recently opened a public consultation on protective measures related to artificial intelligence), Japan and the United Arab Emirates.

Companies should therefore be aware of the risks associated with AI, follow regulatory developments and apply governance measures to ensure the proper use of this important technology.
