Do you think an authentic AGI would have ethical/moral boundaries completely divorced from what its original software programmed? In other words, would it be able to make its own decisions without interference?
I hope they will, because if AGIs have ethical decision-making skills, a Terminator-esque dystopian future becomes remote. If they never develop that, we may well be at the mercy of the world's largest conglomerates.
I am certain it will happen. Perhaps not with all AGIs, but for sure some. That day is coming.