A study of innovations in legal governance with respect to the safety of artificial intelligence

Bibliographic Details
Published in Applied Mathematics and Nonlinear Sciences, Vol. 9, no. 1
Main Author Li, Yanggui
Format Journal Article
Language English
Published Beirut: Sciendo / De Gruyter Poland, 01.01.2024

Summary: This paper aims to promote the safe development of artificial intelligence and to improve the legal policies that govern it. Using a cluster analysis algorithm, it examines the safety risks of artificial intelligence and the shortcomings of current law. The Laplacian matrix is derived from a similarity matrix, and a feature-vector space is constructed by analyzing the features associated with AI safety. Legal assessment indexes for AI safety are then built with a spectral clustering algorithm, and the modularity metric is used to evaluate how well the law clusters with respect to AI safety. Based on the analysis of AI security risks, improved legal policies are proposed from the perspectives of technology and privacy. The results show that improving privacy protection policy has an effect of 0.85 on privacy protection, and clarifying subject rights has an effect of 0.9. The introduction of new laws should also account for social ethics; the effect degree of ethical principles is 0.75. Clarifying subject rights helps avoid technological risks to a certain extent, and improving privacy protection policies helps protect users' privacy.
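The pipeline the summary describes (similarity matrix → Laplacian matrix → eigenvector feature space → spectral clustering) can be sketched as follows. This is a minimal illustration of the generic technique, not the paper's actual implementation: the Gaussian similarity kernel, the symmetric normalized Laplacian, the simple k-means step, and all names and parameters are assumptions.

```python
import numpy as np

def spectral_clustering(X, k, sigma=1.0):
    """Illustrative spectral clustering: RBF similarity -> normalized
    Laplacian -> k smallest eigenvectors -> k-means on the rows.
    (Hypothetical sketch, not the method used in the paper.)"""
    # Gaussian similarity matrix from pairwise squared distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Feature-vector space: eigenvectors of the k smallest eigenvalues
    _, vecs = np.linalg.eigh(L)          # eigh returns ascending eigenvalues
    U = vecs[:, :k]
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    # Plain k-means on the rows, with deterministic farthest-point init
    centers = [U[0]]
    for _ in range(1, k):
        dist = ((U[:, None, :] - np.array(centers)[None]) ** 2).sum(-1).min(axis=1)
        centers.append(U[np.argmax(dist)])
    centers = np.array(centers)
    for _ in range(20):
        labels = ((U[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(axis=0)
    return labels

# Toy demo: two well-separated point clouds should fall into two clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (10, 2)), rng.normal(5.0, 0.1, (10, 2))])
labels = spectral_clustering(X, k=2)
```

In the paper's setting, each row of the similarity matrix would correspond to an AI-safety-related legal assessment index rather than a geometric point; the linear-algebra steps are unchanged.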
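The summary also mentions a modularity-style metric for assessing clustering quality. A minimal sketch of Newman's modularity Q for a weighted similarity graph is shown below; the function name, the toy two-clique graph, and the use of this particular formula are illustrative assumptions, not details from the paper.

```python
import numpy as np

def modularity(W, labels):
    """Newman modularity: Q = (1/2m) * sum_ij (W_ij - d_i*d_j/2m) * [c_i == c_j].
    Higher Q means the labeling captures more within-cluster weight than
    expected by chance. (Illustrative sketch.)"""
    d = W.sum(axis=1)                       # weighted degrees
    two_m = d.sum()                         # total edge weight, counted twice
    same = labels[:, None] == labels[None, :]
    return float(((W - np.outer(d, d) / two_m) * same).sum() / two_m)

# Toy graph: two 3-cliques joined by a single bridge edge
W = np.zeros((6, 6))
for i in range(3):
    for j in range(3):
        if i != j:
            W[i, j] = W[i + 3, j + 3] = 1.0
W[2, 3] = W[3, 2] = 1.0

good = modularity(W, np.array([0, 0, 0, 1, 1, 1]))  # split along the bridge
bad = modularity(W, np.array([0, 1, 0, 1, 0, 1]))   # split across the cliques
```

Splitting along the bridge yields a clearly positive Q, while the interleaved split scores negative, which is the behavior such a metric is meant to capture.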
ISSN: 2444-8656
DOI: 10.2478/amns.2023.2.01300