Analyses of AI across works of literature and film place the complications of cognitive robots into two categories:
Service Aspect
In Blade Runner and Wall-E, we are presented with two possible futures where humans exhibit overreliance on intelligent service robots. In Blade Runner's case, the robots grow dissatisfied with their thankless jobs and become outlaws. In Wall-E, overreliance on robots leads to obesity as well as humans forgetting their Earthly roots. Judge Dredd and Hondo City Law also give us a common interpretation in science fiction when intelligent robots are involved: they use their cognitive skills for violence. Jane and Pauline, from Ender's Game and Red Mars respectively, question the ethical usage of intelligent computer software. Ultimately, complications with service robots start with the engineering: how will we produce intelligent robots that won't harm humans, and what limits do we put in place to hinder unhealthy reliance on robots, as well as on computer software like Jane and Pauline?
Relationship Aspect
Chobits, Doraemon, and Blade Runner ask a much more difficult question: how should we treat intelligent robots, as equals or as servants? At their core, these works all examine the essence of human life. What makes a human human? In Chobits it is love; in Doraemon, emotion; in Blade Runner, memory. If AI-equipped robots are capable of feelings like humans, does that mean they deserve the same respect as another human? If you lean toward the robots-only-as-servants side, where do we draw the distinction? Megacities of the future will no doubt see a large portion of robots in the workforce, and the question of human-robot distinction will become extremely relevant.