
Thread: Random Stuff

  1. #201
    Join Date
    27th Dec 2007
    Location
    Sydney NSW
    Posts
    37,730

    Default

    The moral dilemma of robot cars

    Here's the thing: robot cars are safer than human-operated cars. They are cleaner, more fuel-efficient, never get road rage or tired, etc. Road injuries and fatalities are likely to be drastically reduced with more robot cars on the road. But what happens when a fatal emergency is about to occur? Should robot cars be programmed to act upon the idea that "the needs of the many outweigh the needs of the few"? Or should they otherwise make 'logical' life-saving decisions which would mean sacrificing other lives?

    This is a question which science fiction writers have long debated, including Isaac Asimov, who of course created the Three Laws of Robotics:
    1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
    3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

    But of course, there may be situations where the robot cannot save every human around it. What then? The film I, Robot touched upon this question with the flashback scene where Spooner (Will Smith) is involved in a car crash, and a nearby robot jumps into the water and saves him instead of a young girl, despite Spooner ordering the robot to sacrifice him and save the girl. The robot simply calculated each person's likelihood of survival and determined that it had only an 11% chance of success if it attempted to save the girl vs. a much higher chance of success if it tried to save Spooner; and its calculations proved correct, as Spooner was saved but the girl drowned. But if the robot had attempted to save the girl instead, there would've been a higher probability that both parties would have perished, thus violating the First Law of Robotics.
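
    To put the robot's calculus in concrete numbers, here's a minimal sketch of that choice as an expected-survivors calculation. The 11% figure is quoted in the film; the 45% chance for Spooner is just an assumed stand-in for the "much higher chance" it calculated for him.

        # Toy model of the I, Robot rescue scene: the robot can attempt only
        # one rescue, so the person not chosen drowns, and the expected number
        # of survivors is simply the chosen rescue's success probability.
        RESCUE_ODDS = {
            "Spooner": 0.45,  # assumed stand-in for his "much higher chance"
            "girl": 0.11,     # the film's stated chance of saving the girl
        }

        def expected_survivors(target: str) -> float:
            """Expected survivors if the robot attempts to save `target`."""
            return RESCUE_ODDS[target]

        best = max(RESCUE_ODDS, key=expected_survivors)
        print(f"Robot saves: {best}")  # -> Spooner, matching the film
        for target, p in RESCUE_ODDS.items():
            print(f"  attempt {target}: expected survivors = {p:.2f}")

    Under a purely numeric reading of the First Law, maximising that expectation is the "correct" move, which is exactly what makes the scene so uncomfortable.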

    So what do you guys think? Should robot cars be programmed to consider the needs of the many over the needs of the few, even if it means sacrificing their owners/occupants, or should they be programmed to protect their owners/occupants, even if it means sacrificing a greater number of people outside the vehicle?

  2. #202
    Join Date
    7th Mar 2012
    Location
    The Moon
    Posts
    6,605

    Default

    Quote Originally Posted by GoktimusPrime View Post
    The moral dilemma of robot cars

    Here's the thing: robot cars are safer than human-operated cars. They are cleaner, more fuel-efficient, never get road rage or tired, etc. Road injuries and fatalities are likely to be drastically reduced with more robot cars on the road. But what happens when a fatal emergency is about to occur? Should robot cars be programmed to act upon the idea that "the needs of the many outweigh the needs of the few"? Or should they otherwise make 'logical' life-saving decisions which would mean sacrificing other lives?

    This is a question which science fiction writers have long debated, including Isaac Asimov, who of course created the Three Laws of Robotics:
    1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
    3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

    But of course, there may be situations where the robot cannot save every human around it. What then? The film I, Robot touched upon this question with the flashback scene where Spooner (Will Smith) is involved in a car crash, and a nearby robot jumps into the water and saves him instead of a young girl, despite Spooner ordering the robot to sacrifice him and save the girl. The robot simply calculated each person's likelihood of survival and determined that it had only an 11% chance of success if it attempted to save the girl vs. a much higher chance of success if it tried to save Spooner; and its calculations proved correct, as Spooner was saved but the girl drowned. But if the robot had attempted to save the girl instead, there would've been a higher probability that both parties would have perished, thus violating the First Law of Robotics.

    So what do you guys think? Should robot cars be programmed to consider the needs of the many over the needs of the few, even if it means sacrificing their owners/occupants, or should they be programmed to protect their owners/occupants, even if it means sacrificing a greater number of people outside the vehicle?
    It's a poor question. Who are the few? Who are the many? Are you referring to the occupant/s of the vehicle as the few? What if they are the many? What if both parties are equal in number?

    It's too complicated a thing to program. The car, just like the autopilot on a commercial airliner, should be programmed to protect the occupants. It would naturally do so by obeying the road rules and scanning for hazards so it can avoid them safely.
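
    To make that concrete, here's a minimal sketch of what an occupant-first policy could look like. The hazard fields, thresholds and braking figure are invented for illustration, not taken from any real vehicle's control logic.

        # Toy occupant-first policy: never swerve into the unknown; the only
        # evasive action is braking in-lane, which is how the policy protects
        # occupants and bystanders alike.
        from dataclasses import dataclass

        @dataclass
        class Hazard:
            kind: str          # e.g. "pedestrian", "vehicle", "debris"
            distance_m: float  # distance ahead of the car
            in_lane: bool      # whether it's in our path

        def choose_action(speed_mps: float, hazards: list) -> str:
            BRAKE_DECEL = 7.0  # m/s^2, roughly a hard stop on dry bitumen
            stopping_distance = speed_mps ** 2 / (2 * BRAKE_DECEL)
            in_lane = [h for h in hazards if h.in_lane]
            # If anything in our lane is inside 1.5x our stopping distance,
            # stop as hard as we can; otherwise slow down and keep watching.
            if any(h.distance_m < stopping_distance * 1.5 for h in in_lane):
                return "emergency_brake"
            if in_lane:
                return "slow_and_monitor"
            return "proceed"

        # ~60 km/h with a pedestrian 20 m ahead -> "emergency_brake"
        print(choose_action(16.7, [Hazard("pedestrian", 20.0, True)]))

    The appeal of a rule like "brake, don't swerve" is that braking trades a known risk for a smaller known risk, whereas swerving trades it for an unknown one (the footpath, the oncoming lane, that lounge room).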

    I can't say I'd be at ease with a car programmed so that, in the event that 2 people (unintentionally) disobey the road rules and step out in front of me, it could potentially sacrifice me to save those 2 people. What if, to avoid those 2 people, the car decided to plow into the lounge room of a house and killed the single occupant? What if it turned out there was a children's birthday party in that lounge room, and your car had just made a pre-programmed decision that resulted in the deaths of 6 children, all to save 2 people who were at fault?


    Imagine the lawsuits.
    Dovie'andi se tovya sagain

  3. #203
    Join Date
    21st Mar 2014
    Location
    Queensland
    Posts
    1,010

    Default

    Quote Originally Posted by Trent View Post
    I can't say I'd be at ease with a car programmed so that, in the event that 2 people (unintentionally) disobey the road rules and step out in front of me, it could potentially sacrifice me to save those 2 people. What if, to avoid those 2 people, the car decided to plow into the lounge room of a house and killed the single occupant? What if it turned out there was a children's birthday party in that lounge room, and your car had just made a pre-programmed decision that resulted in the deaths of 6 children, all to save 2 people who were at fault?


    Imagine the lawsuits.
    Not that I disagree with you (I think the car should prioritise the safety of the occupant before anyone outside; I'm not against sacrificing my safety depending on the situation, but I would like that decision to be left entirely up to me, not a computer), but if you were driving by yourself and two people stepped out in front of you, wouldn't you instinctively attempt to swerve to avoid them?

    To be honest, I don't like the idea of programmed robot cars. Every piece of technology up until now (including cars) has been hackable by someone at some time. I think it naive to think any type of robotic car in the future would be hacker-proof.

    Not to mention potential software conflicts, bugs, etc. In a future that is heading towards a majority-wireless society, how will people control the software/firmware updates that the robot car will no doubt require? Look at Microsoft as an example. They roll out Windows patches and updates very often, and sometimes a particular update will cripple or severely damage a Windows install (granted, it is rare, but it happens). And with companies being more and more sneaky about updates these days, I think there is a high probability that such updates would just get transmitted wirelessly as you are driving, without the driver even knowing.

    And I'm sure robot cars will need some sort of GPS connectivity to be able to drive. And again, we all know how unreliable GPS can be sometimes.

    Maybe I've seen too many SF movies and anime, but those are my main concerns about robot cars in the future. And maybe I'm in a minority, but I actually like driving my car with my own two hands.

  4. #204
    Megatran Guest

    Default

    Wow. Just heard George Street Sydney CBD will be closed off to vehicle traffic for 3 years due to light rail work.

    I remember visiting Sydney some years back and heading to The Rocks along George Street in a taxi. Ended up walking part of the way. Faster, too.

  5. #205
    Join Date
    7th Feb 2013
    Location
    2164
    Posts
    8,925

    Default

    Quote Originally Posted by GoktimusPrime View Post
    So what do you guys think? Should robot cars be programmed to consider the needs of the many over the needs of the few, even if it means sacrificing their owners/occupants, or should they be programmed to protect their owners/occupants, even if it means sacrificing a greater number of people outside the vehicle?
    Neither. There are too many variables to even consider such notions, let alone attempt to implement them.

  6. #206
    Join Date
    23rd Sep 2010
    Location
    Sydney
    Posts
    9,352

    Default

    Can you imagine the state of the car industry if cars were programmed to prioritise the safety of others over the occupants? There would suddenly be a lot of push bikes on the road.
    My Fan interview with Big Trev

    my original collection from when I was more impressionable.
    My Current Collection Pics (Changing on occasion)

  7. #207
    Join Date
    5th May 2008
    Location
    Clifton Hill, Melbourne
    Posts
    4,272

    Default

    Quote Originally Posted by Megatran View Post
    Wow. Just heard George Street Sydney CBD will be closed off to vehicle traffic for 3 years due to light rail work.
    I think they have similar plans for Swanston St in Melbourne if they start work on the train tunnel under the city.
    |Buy ALL my things!|Collection Thread|Current Collection Count: ~661|
    |Wants|Galaxy Force Blue Rumble|

  8. #208
    Join Date
    7th Apr 2010
    Location
    BRAYBROOK
    Posts
    2,778

    Default

    Quote Originally Posted by Golden Phoenix View Post
    I think they have similar plans for Swanston St in Melbourne if start work on the train tunnel under the city.
    Not as bad now, as they will tunnel with a tunnel boring machine and the tunnel itself will be deeper.
    WANTED BOTS: G1: Horri-bull, Snarler, Mainframe, Chop Shop, Ransack CHUG: Spin Out, Cordon, Brotropolis Rescue MASTERPIECE: Acid Storm
    ENERGON: Six Shot

  9. #209
    Join Date
    3rd Sep 2014
    Location
    North-west Sydney
    Posts
    2,044

    Default

    The other day the talented Diane Charlemagne passed away. For those who don't know her, she was a singer who sang on many tracks, including a bunch of drum & bass tunes from artists like High Contrast, Netsky, S.P.Y., Cyantific, Taxman and Goldie.

    RIP Diane, you'll be missed.
    I'll update this when I'm needing help finding particular figures

  10. #210
    Join Date
    16th Mar 2015
    Location
    Young
    Posts
    1,693

    Default

    At the going down of the sun,
    And in the morning,
    We will remember them.

    Lest we forget
    "Save the rebellion! Save the dream!" - Saw Gerrera

