Singularity

My grandfather is an engineer and really loves science, technology, etc. The only thing I really know that he fears is the singularity. He has explained that he believes robots will be programmed to become increasingly smart, and as they do they will rapidly become more intelligent than humans and will soon after dominate or eradicate the human race. His explanation usually involves numbers, so he'll talk about how, if AI is programmed to become more intelligent at, say, 2x the rate of humans, then it will be a very rapid change.
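
To make that arithmetic concrete, here is a minimal sketch of the compounding argument; the starting levels and growth rates are illustrative assumptions, not figures from the post itself:

```python
# Sketch of the "2x growth rate" argument with made-up numbers.
# The baselines and rates below are assumptions for illustration only.

human_level = 100.0   # arbitrary baseline "intelligence" units
ai_level = 10.0       # AI starts well below the human baseline
human_growth = 0.01   # humans improve ~1% per cycle
ai_growth = 0.02      # AI improves at 2x the human rate, compounding

cycle = 0
while ai_level < human_level:
    human_level *= 1 + human_growth
    ai_level *= 1 + ai_growth
    cycle += 1

print(f"AI overtakes the human baseline after {cycle} cycles")
# With these numbers the crossover happens after roughly 230 cycles,
# and the gap then widens every cycle because the growth compounds.
```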

Stopping there, one question I have is whether it is possible, or will ever be possible, to program an AI to make itself more intelligent? Or do we merely have to program it to a certain intelligence level, where it will remain from then on?

My concerns about AI, with regard to the singularity, are: 1) that an AI connected to the Internet could use all the available information there to improve and advance itself to the point where it could dominate humans, and 2) that AI used by the military could go rogue.

Are one, all, or none of these scenarios reasonable or likely to occur? Should we be truly concerned about the singularity?

Thank you for your time and input! I'm sorry if my questions are meaningless - I don't know much about AI at all, but the idea of the singularity intrigues me.
