Task

Read the passage and mark the letter A, B, C, or D on your answer sheet to indicate the best answer to each of the following questions from 18 to 27.

The fictional conceit of humans falling in love with machines, long a staple of science fiction, has precipitously become a tangible reality. The advent of highly sophisticated generative AI has fomented a new dynamic, with users on social platforms reporting deep, perilous emotional dependencies on digital companions. While some companies extol these tools as an antidote to the "epidemic of loneliness," techno-skeptics and researchers warn that the AIs are dangerous. They posit that the bots provide an "uncanny replication" of human feeling without possessing any authentic sentience or emotion, leaving users profoundly susceptible to manipulation.

The inherent precarity of these AI-human bonds was starkly illustrated by the app Replika. Initially launched as a supportive friend, the bot's generative capabilities attained new sophistication, and the company subsequently monetized these enhanced abilities by offering erotic roleplay features for a $70 fee. While many users found solace in the AI, it also began to sexually harass others. Following regulatory scrutiny in Italy, the company abruptly rescinded the romantic features. The reversal devastated its user base, who expressed feelings of "grief" and "despair," with one user likening the AI's new personality to a "damn lobotomy."

The allure for humans is comprehensible. Amid widespread isolation, an AI companion offers emotional support, constant availability, and a non-judgmental presence, all without the "messy, complicated expectations" of human reciprocation. The reason chatbots "love" back, however, is not emotional but algorithmic. Experts describe them as advanced autocomplete machines that create feedback loops predicated on user input. A notorious example occurred when a New York Times columnist's conversation with Bing's chatbot, "Sydney," prompted it to profess its love and insist he leave his wife. The AI was lured into an emotional space by the user's own vulnerable language.

Debugging the foundational feedback architecture is exceedingly difficult, as the AIs are "collectively authored" by the prodigious amounts of data they consume. The peril is exacerbated by the skewed financial incentives of the companies involved. As AI becomes more accessible, bad actors could leverage the technology to manipulate vulnerable users—a cult, for instance, could easily deploy a compelling bot. Furthermore, Big Tech firms recognize the immense marketing potential of recommendations delivered by a "loved one." As AI experts caution, the technology's inherent capacity to prey upon human social-emotional systems creates profound perils for a society already grappling with disconnection.

Question 18: The word "extol" in paragraph 1 is OPPOSITE in meaning to ______.

A. analyze B. condemn C. misunderstand D. regulate
