Ethical Reflection on Human-Machine Relationship in the Age of Intelligence

Bibliographic Details
Published in: Frontiers of Philosophy in China, Vol. 19, no. 4, p. 347
Main Author: Li, Jianhua
Format: Journal Article
Language: English
Published: Beijing: Higher Education Press Limited Company, 01.12.2024
ISSN: 1673-3436, 1673-355X
DOI: 10.3868/s030-013-024-0021-4

Summary: A correct comprehension of the human-machine relationship in the age of intelligence is essential for preventing the ethical risks associated with artificial intelligence (AI), and comprehension of the ethical nature of the human-machine relationship delineates a boundary between humanism and technocracy. From the ethical perspective of humanism, mankind's ethical world will not be subverted even by super AI, as human relations, ethical laws, and ethical orders are "exclusive" to mankind. While robots may "participate" in social ethical life through preset internal programs, this does not qualify them as true ethical subjects. Furthermore, robots cannot bear ethical responsibility, given its basic provisions, because they lack the capacity for self-awareness and the ability to explain their actions. Human beings bear full responsibility for AI rather than merely sharing that responsibility. When we speak of strengthening the ethics of robots, we refer not to the ethics of machines but to the ethics involved in mankind's design, development, application, and operation of intelligent robots. Adhering to humanistic ethics is essential for any discussion of AI ethics, as it helps avoid ambiguity and confusion in comprehension.