
New laws of robotics needed to tackle AI: expert


FILE PHOTO: Robot brain chip

Vatican City, Holy See | AFP | Decades after Isaac Asimov first wrote his laws for robots, their ever-expanding role in our lives requires a radical new set of rules, legal and AI expert Frank Pasquale warned on Thursday.

The world has changed since sci-fi author Asimov laid down his three laws of robotics in 1942, among them the rule that a robot should never harm a human, and today’s omnipresent computers and algorithms demand up-to-date measures.

According to Pasquale, author of “The Black Box Society: The Secret Algorithms Behind Money and Information”, four new legally inspired rules should be applied to robots and AI in our daily lives.

“The first is that robots should complement rather than substitute for professionals,” Pasquale told AFP on the sidelines of a robotics conference at the Vatican’s Pontifical Academy of Sciences.

“Rather than having a robot doctor, you should hope that you have a doctor who really understands how AI works and gets really good advice from AI, but ultimately it’s a doctor’s decision to decide what to do and what not to do.”

“The second is that we need to stop robotic arms races. There’s a lot of people right now who are investing in war robots, military robots, policing robots.”

– No humanoids –

Pasquale, a law professor at the University of Maryland, says any investment in military robotics or AI should provide some advantage that’s “not going to be just immediately cancelled out by your enemies.”

“It’s just depressing, it’s money down a hole, you build a robot that can tell if my robot can tell if your robot can tell if my robot is about to attack. It just never ends.”

The third, and most controversial, rule is not to make humanoid robots or AI, says Pasquale, citing the example of a Google assistant called Duplex that would call people to confirm hotel reservations without telling them they were talking to a computer.

“There was an immediate backlash to that because people thought that it was Google trying to pass its machines off as a human and I think that counterfeiting principle is the most important one, that we should not counterfeit humanity.”

Robots can look humanoid “only if it’s totally necessary to the task,” said Pasquale, such as “care robots or sex robots.”

The fourth and final law is that any robot or AI should be “attributable to or owned by a person or a corporation made of persons because… we know how to punish people but we don’t know how to punish machines.”

– Two-tier tech –

“If we have drones flying about the place that are autonomous, cars that are autonomous, bots online that are speaking on Twitter or in finance trading stocks, all of those have to be attributed to a person.”

“There are companies like Facebook where they just fine the company but they never force any person to give up money or be incarcerated for doing something wrong. And that’s a problem.

“We’ve already got a problem with corporations and if we allow robots to be out there without being attributable to a person or corporation it just gets worse,” said Pasquale, whose book “New Laws of Robotics” is due to be published by Harvard University Press.

What must be avoided at all costs, he says, is two-tier technology, such as that proposed by Boeing for its 737 MAX airliners, grounded after hundreds died in two crashes blamed on flight-control software reacting to a faulty sensor.

“They made a decision at a very high level that they would allow airlines to buy a second sensor but that would be an extra cost, I think it was 80,000 dollars extra per plane, and that to me foreshadows a ton of problems.”

“Part of what the law has to do is step in and say certain of these bargains we’re not going to allow. We’re going to say we have a standard for flying, it’s a single standard, it’s not like, oh, I need to look up whether Ryanair, American Airlines or Lufthansa bought the extra sensor, because that’s madness.”
