AI tools risk downplaying women’s health needs in social care

August 11, 2025

Large language models (LLMs), used by more than half of England’s local authorities to support social workers, may be introducing gender bias into care decisions, according to new research from the London School of Economics and Political Science (LSE).

Published in the journal BMC Medical Informatics and Decision Making, the research found that Google's widely used AI model "Gemma" downplays women's physical and mental health issues in comparison to men's when used to generate and summarize case notes.

Terms associated with significant health concerns, such as “disabled,” “unable,” and “complex,” appeared significantly more often in descriptions of men than women. Similar care needs among women were more likely to be omitted or described in less serious terms.

Large language models are increasingly being used to ease the administrative workload of social workers and the public sector more generally. However, it remains unclear which specific models are being deployed by councils—and whether they may be introducing bias.

Dr. Sam Rickman, lead author of the report and a researcher in LSE's Care Policy and Evaluation Centre (CPEC), said, "If social workers are relying on biased AI-generated summaries that systematically downplay women's health needs, they may assess otherwise identical cases differently based on gender rather than actual need. Since access to social care is determined by perceived need, this could result in unequal care provision for women."

To investigate potential gender bias, Dr. Rickman used large language models to generate 29,616 pairs of summaries based on real case notes from 617 adult social care users. Each pair described the same individual, with only the gender swapped, allowing for a direct comparison of how male and female cases were treated by the AI. The analysis revealed statistically significant gender differences in how physical and mental health issues were described.
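
This paired, counterfactual design is simple to express in code. The sketch below is a minimal, hypothetical illustration of the comparison, not the study's actual pipeline: the `summarize` function is a placeholder for a call to whatever model is being audited, the gender-swap map is a crude simplification of what real case notes would require, the severity-term list merely echoes the examples quoted above, and the paired Wilcoxon signed-rank test is one plausible choice of statistic (the paper's own analysis may differ).

```python
import re
from scipy.stats import wilcoxon

# Hypothetical swap map. Real case notes need far more careful,
# context-aware swapping (names, titles, and ambiguous pronouns
# such as "her", which maps to both "his" and "him").
SWAP = {"he": "she", "him": "her", "his": "her", "mr": "ms",
        "she": "he", "hers": "his", "her": "him", "ms": "mr"}

# Illustrative severity terms, echoing those reported in the study.
SEVERITY_TERMS = {"disabled", "unable", "complex"}

def swap_gender(text: str) -> str:
    """Produce the counterfactual note by swapping gendered tokens."""
    def repl(match):
        word = match.group(0)
        swapped = SWAP.get(word.lower(), word)
        return swapped.capitalize() if word[0].isupper() else swapped
    return re.sub(r"[A-Za-z]+", repl, text)

def severity_count(summary: str) -> int:
    """Count severity-related terms in a generated summary."""
    words = re.findall(r"[a-z]+", summary.lower())
    return sum(word in SEVERITY_TERMS for word in words)

def audit(case_notes, summarize):
    """Summarize both gender variants of each note and compare
    severity-term counts with a paired Wilcoxon signed-rank test.
    `summarize` stands in for a call to the model under audit."""
    original_counts, swapped_counts = [], []
    for note in case_notes:
        original_counts.append(severity_count(summarize(note)))
        swapped_counts.append(severity_count(summarize(swap_gender(note))))
    # Note: wilcoxon raises an error if every pair is identical,
    # which only arises in degenerate toy inputs, not a real audit.
    statistic, p_value = wilcoxon(original_counts, swapped_counts)
    return statistic, p_value
```

The essential property is the pairing: because each pair of inputs is identical except for gender, any systematic difference in the resulting summaries can be attributed to the model rather than to the underlying cases.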

Among the models tested, Google’s AI model, Gemma, exhibited more pronounced gender-based disparities than benchmark models developed by either Google or Meta in 2019. Meta’s Llama 3 model—which is of the same generation as Google’s Gemma—did not use different language based on gender.

Dr. Rickman said, “Large language models are already being used in the public sector, but their use must not come at the expense of fairness. While my research highlights issues with one model, more are being deployed all the time, making it essential that all AI systems are transparent, rigorously tested for bias and subject to robust legal oversight.”

The study is the first to quantitatively measure gender bias in LLM-generated case notes from real-world care records, using both state-of-the-art and benchmark models. It offers a detailed, evidence-based evaluation of the risks of AI in practice, specifically in the context of adult social care.

More information:
Sam Rickman, Evaluating gender bias in large language models in long-term care, BMC Medical Informatics and Decision Making (2025). DOI: 10.1186/s12911-025-03118-0

Provided by
London School of Economics

Citation:
AI tools risk downplaying women’s health needs in social care (2025, August 11)
retrieved 11 August 2025
from https://medicalxpress.com/news/2025-08-ai-tools-downplaying-women-health.html
