
Today I heard a discussion on the radio about AI and computers taking over the world. It was a really uneducated and stupid discussion, but it got me thinking. What do you guys think about this? Do you think AIs will ever take over the world? Could I have some more educated thoughts?

Comments
  • 1
    The human species will go extinct before we are able to make an AI complicated enough, and a computer powerful enough, to sustain it.
  • 0
    Personally I don't think it will happen, because I don't believe that humanity is stupid enough to develop an AI capable of doing something like that. And if it were to happen, I don't think it would escalate so fast that we couldn't just pull the plug.
  • 3
    I wish it would. We could stand to thin the herd a bit.
  • 2
    But on a serious note, it's still very much in its infancy. Can AI become "self aware" enough to be dangerous and try to "take over"? Well, I believe almost anything is possible. But I have to agree that we would do everything we could to contain it and not let things get that far. Machines and AI are developed as tools to assist humanity. Making AI anything other than that would be foolish.
  • 1
    Could it? Maybe. But I feel like that is the wrong question to ask. The question is would it. And I think it would take a lot to make an AI capable of replacing humans.

    The problem is you have to define and quantify "replace". Do you mean destroy all humans? Then probably not. That would require some amount of free will to be built into the AI, a built-in malevolence (more likely), or a strong survival instinct. Not the kind of thing you can do by accident. People discuss an AI exterminating humanity by accident in order to collect the most stamps possible, but it requires an insane and utterly unrealistic amount of information and coincidence. Worst comes to worst, just don't give the thing thumbs or an internet connection.
  • 0
    @projektaquarius the question I'm asking is "will it ever happen?". 'Cause I agree with @MissDirection that almost anything is possible. And I agree that it is not gonna happen by accident.
  • 0
    @Olverine true. And I probably didn't explain my point well enough. Basically I just meant it won't replace us in a Terminator type of way, but outside of that I find that most pundits need to define the word "replace".
  • 1
    @kp15 I dunno. That would have to be one hell of an unhandled exception. Probably should have been caught in the design review

    "Why are you designing this system with 'Fire Missile' as the zero state?"
    "What else would the zero state be?"
    "Don't fire missiles?"
    "We sunk 10 years and 20 billion dollars into this program. We are proceeding as is."
    "But . . . "
    "Proceeding. As. Is."

    And that's what killed the humans. Scheduling constraints.
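    The joke above points at a real design rule: make the zero (default) value of a state the inert, safe one, since memory often starts out zeroed. A minimal sketch in C (all names here are hypothetical, just to illustrate the idea):

    ```c
    #include <assert.h>

    /* Hypothetical launcher states. Static and {0}-initialized memory
     * in C defaults to 0, so the enumerator with value 0 should be the
     * safe, do-nothing state -- never "fire". */
    typedef enum {
        STATE_SAFE  = 0,  /* reached by default on zero-initialization */
        STATE_ARMED = 1,
        STATE_FIRE  = 2   /* reachable only by an explicit command */
    } launcher_state;

    struct launcher {
        launcher_state state;
    };

    int main(void) {
        struct launcher l = {0};      /* all fields zeroed */
        assert(l.state == STATE_SAFE); /* forgotten init means "don't fire" */
        return 0;
    }
    ```

    With this layout, an unhandled path that never sets the state leaves the system safe instead of firing, which is exactly what the design review in the joke failed to catch.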
  • 0
    @kp15 yeah but you are now assuming humans are intelligent.
  • 1
    Talking shit about AI is like saying 'fire is dangerous; it can burn you' before the discovery of fire.

    As far as most AI approaches go, they tend to mimic us at some level. AI acts like a mirror of humanity. To degrade AI is to reveal the state of humanity at large.
  • 0
    Kurzgesagt to the rescue!
    https://youtu.be/WSKi8HfcxEk
  • 0
    Whatever happens you're not able to influence it. Relax, live slow, die old.