Author Jon Ward details how he left his evangelical church, his growing alarm over how some Christian conservatives have attacked truth, and what it would take for White evangelicals to abandon Donald Trump.
LLMs can currently be convinced that God exists, so I wouldn't put too much weight on that.
When you can ignore 1,000 detractions and cling to a single flimsy confirmation of your bias, the problem isn't the quality of the information available or how it's presented; it's the quality of the person and their willingness to reason.
I really don't understand how people who watched experts say in 2019 that "XYZ will be impossible for the technology," only to see that very thing happen three years later in 2022 while researchers made surprised Pikachu faces, can so regularly talk as if the present state of the tech will remain the status quo for the foreseeable future.
You do realize it's going to continue to improve at a ridiculous rate, right?
Even the version that exists today has seen over a 100% increase in its performance on various evaluations simply because researchers have learned to use it better over the past year. And we're likely getting a new leap forward in the models themselves next year.
I wouldn't be so quick to ignore thousands of indicators that the technology is advancing just to cling to the notion that it isn't going to make a difference based on its present limitations.
I didn't say the tech won't improve. I said the basis of the point being made - educating the religious away from religion - won't occur. That has everything to do with challenging philosophical conjectures and unproven belief systems, and nothing to do with ChatGPT's ability to produce convincing content or build upon it given more input.
Part of my work involves using and integrating ChatGPT with other systems. I see the evolution right in front of my eyes every day, and it doesn't make a damn bit of difference to my point.
It's often hard to see the forest when you are focused on the trees.
If you read the article, look at what changed his mind, look at other deconversion stories, and look at where the tech is going in ~3 years, then I guess I just don't see it the same way you do. To my eye, it will make quite a bit of damn difference when someone beginning to question can have the heavy lifting of self-education significantly reduced.
If you're willing to extrapolate the experience of one believer and their opinion, then yes, but that's not enough for me.
And treating my experience of working directly with said technology as narrower than it is, is your prerogative, but I don't know why you'd expect me to take your opinion seriously after doing so.