Could AI be the Great Filter? What Astrobiology can Teach the Intelligence Community about Anthropogenic Risks
Main Author | Bailey, Mark M |
Format | Journal Article |
Language | English |
Published | 09.05.2023 |
Abstract | Where is everybody? This phrase distills the foreboding of what has come to be known as the Fermi Paradox - the disquieting idea that, if extraterrestrial life is probable in the Universe, then why have we not encountered it? This conundrum has puzzled scholars for decades, and many hypotheses have been proposed suggesting both naturalistic and sociological explanations. One intriguing hypothesis is known as the Great Filter, which suggests that some event required for the emergence of intelligent life is extremely unlikely, hence the cosmic silence. A logically equivalent version of this hypothesis -- and one that should give us pause -- suggests that some catastrophic event is likely to occur that prevents life's expansion throughout the cosmos. This could be a naturally occurring event, or more disconcertingly, something that intelligent beings do to themselves that leads to their own extinction. From an intelligence perspective, framing global catastrophic risk (particularly risks of anthropogenic origin) within the context of the Great Filter can provide insight into the long-term futures of technologies that we don't fully understand, like artificial intelligence. For the intelligence professional concerned with global catastrophic risk, this has significant implications for how these risks ought to be prioritized. |
Author | Bailey, Mark M |
Copyright | CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0) |
DOI | 10.48550/arxiv.2305.05653 |
OpenAccessLink | https://arxiv.org/abs/2305.05653 |
SecondaryResourceType | preprint |
SourceType | Open Access Repository |
SubjectTerms | Computer Science - Artificial Intelligence; Computer Science - Computers and Society; Physics - Physics and Society |
URI | https://arxiv.org/abs/2305.05653 |
linkProvider | Cornell University |