If only they had given a control group an off-the-shelf social game like L.A. Noire or a D&D play group.
I really don’t think a random D&D table is the place to learn to express empathy, and I wish people would stop treating local D&D groups as a good way to learn to socialize in general. I’m not saying you can’t learn things at the table, but the games aren’t accurate reflections of reality, and there’s a lot of go-along-to-get-along pressure, or just run-of-the-mill toxic group dynamics. The hobby can already be hard for minorities to enter, and seating someone still learning social skills (especially how to express empathy) next to someone from a marginalized group can lead to unfortunate outcomes that your standard DM or group isn’t equipped to address. One or both parties can come away with negative experiences that reinforce the idea that they’re unwelcome, and the rest of the table can come away with negative impressions of playing with ND people or minorities.
Sometimes practicing first with people trained to teach this is the best step, and second to that would be practicing empathy in a space where the main goal is bonding, rather than in one with the additional, nebulous goal of having fun playing a game. I don’t know if AI is the answer, but trusting your local DM or table to be able to teach empathy is a big ask. It’s almost insulting both to the people who teach this and to people with ASD. Teaching empathy to people with ASD can’t be as passive as it is for everyone else, and acting like it’s something they’re expected to just pick up, while also juggling all these other elements, makes it seem like you don’t think it’s something they actually have to work to achieve. I’m not on the spectrum, but I have a lot of autistic friends, and I would not drop just any of them into a D&D game and expect them and the rest of the table to figure it out.
Also, comparing against a no-intervention control is generally the gold standard. They did what is needed to show their approach has some kind of effect.
I’ll be honest, I find the framing of the study offensive and I’m not sure if I have the words but I’ll try.
It’s less that this study compared itself to no intervention, and more the social and political context: AI being pushed as a way to make caregiving more efficient while sacrificing quality.
I don’t personally find the framing offensive, but I’m not on the spectrum so I can’t speak to it from that perspective. My comment was less about the article and more about not offloading that work onto unsuspecting and unprepared people.
That being said, I’m not as anti-AI as some other people might be when it comes to these kinds of tools. The study itself highlights that not everyone has the resources to get the kind of high-quality care they need, and this might be an option. I agree that sacrificing quality for efficiency is bad (you can see in my post history that I’ve made that argument about AI myself), but realistically, many people who would otherwise have no alternatives could potentially benefit from this.

Additionally, AI will only keep getting better, and while I hope you’ve never had a bad experience with a professional, I can say from personal experience that quality varies drastically between individuals in the healthcare industry. If something like this could be offered through public libraries or school systems, so that anyone with the need could take advantage of it, I think that would be a positive, because we’re nowhere near universal physical healthcare, much less universal mental healthcare or actual social-development training. I know people who can’t afford healthcare even though they have insurance, so if they could go to a specialized AI for an issue, I’d consider that a net positive even if it’s not a real doctor.

I know AI isn’t there yet, and there’s a lot of political and social baggage, but the reality is that people need help, they need it now, and they’re not getting it. I don’t know how good this AI is, but if the alternative is telling people that are struggling and have no other options that they have to tough it out, I’m willing to at least entertain the idea. For what it’s worth, if I could snap my fingers and give everyone all the help and support they need without AI, I would choose that option; I just don’t have it. I also don’t know that LLMs can really do this successfully at scale, so I’d need evidence of that before fully supporting it. I just think it shouldn’t be written off completely if it’s showing promise.
It might get cheaper, but that doesn’t mean it’s doing a better job.
if the alternative is telling people that are struggling and have no other options that they have to tough it out

That’s just it: if you’re talking to someone who is struggling with this, there is already a better option, which is showing empathy. I suspect our perceived lack of empathy is a reflection of how society treats people in general; we’re just more honest about it and recognize it’s mostly platitudes.
By getting better, I mean it will keep improving on itself. I never meant to imply that it will be better than a trained professional.
I agree that showing ND people empathy is the best path forward, but realistically, being able to socially signal empathy is a life skill, and lacking it mostly damages a person’s own prospects. It would be great if lacking it didn’t make people less employable or less able to build a robust support network, but unfortunately it does. Yes, ASD differences are often a reflection of how society treats people, but a demonstration of empathy is not a platitude; it’s an important way NT people and many ND people connect. If you think expressing empathy is difficult for people with ASD because they are more honest, then I think you may be conflating lack of empathy with difficulty expressing it. There’s nothing dishonest about saying “I’m sorry that happened to you” unless you are not, in fact, sorry it happened. It might not be something you would normally say out loud, but if hearing about something bad happening to someone doesn’t make you feel for them, then the difficulty isn’t expressing empathy, it’s lacking it. Society certainly does a lot of things for bad or nonsensical reasons, but expressing empathy generally isn’t one of them.
Saying “you’re not worth the time for personal interaction, but here’s a reason that’s okay” is a platitude.
I at no point said that anyone wasn’t worth the time for personal interaction. I said multiple times that my preferred solution would not involve having to resort to AI. That’s such a bad faith interpretation of my position that I can’t imagine this being productive at this point. Best of luck.