In 2023, the SurrogacyUK Foundation became the first UK-registered charity in the fertility space dedicated to advancing public awareness and understanding of surrogacy. In line with our goals, we aim to engage with young people who may wish to consider surrogacy in the future due to health, genetics, fertility or sexuality limiting traditional routes to parenthood.
Like many charities, our Board of Trustees can claim many qualities – but not youth. To better understand how to reach young people and how 'digital natives' research matters relating to health, genetics, fertility or sexuality, we engaged a young surrogacy-connected person, Elliot. This insight helped to highlight adjustments that educational charities like ours may need to make to accommodate artificial intelligence (AI) and related tools in our outreach.
Young people (13-24) are driving changes in how we search for information. While Google remains predominant, in 2025, 17 percent of young people were using the AI chatbot ChatGPT, and research suggests that a 'tipping point' for adoption of AI will occur around 2030. AI chatbots – and the large language models (LLMs) that power them, including the generative pretrained transformers (GPTs) after which ChatGPT is named – differ from more traditional search engines. They draw on vast amounts of information (reliable or otherwise) and summarise it in a conversation-like manner, often displaying 'human-like' characteristics such as humour and empathy. However, these tools can also 'hallucinate', presenting incorrect information as fact.
With a healthy scepticism – and some significant ethical concerns – Elliot undertook a small methodical assessment of the practical and ethical implications of using AI chatbots in surrogacy research. Three commonly used chatbots – ChatGPT, Gemini and Copilot – were asked simple, entry-level questions (such as 'Can I use surrogacy?') to assess the quality of their responses. Complex legal or nuanced questions, where misinformation or hallucination is more likely, were not posed.
Overall, the results were largely encouraging – the chatbots provided mostly accurate information, without obvious hallucination. However, they struggled with grey areas. For example, some stated that 'medical eligibility' was required for access to surrogacy in the UK. In reality, while most clinics require evidence of medical necessity, this is not a legal requirement. Chatbots are more prone to hallucination on nuanced topics, which is especially important for surrogacy, where misinformation is rife and often recirculated.
Trust in LLMs – in their ability to provide reliable information – is critical. Only ChatGPT cited its sources upfront, but all of the chatbots referenced authoritative organisations and sources (such as the Human Fertilisation and Embryology Authority, the GOV.UK public sector website and the Children and Family Court Advisory and Support Service). ChatGPT referenced our charity where relevant, and all three chatbots referenced our sister organisation (SurrogacyUK) and other surrogacy organisations. The chatbots each presented succinct, accessible responses, and suggested sensible 'next steps' or 'things to consider'.
Part of the appeal of these chatbots is their conversational, human-like style, and this was reflected in empathic responses to various prompts. For example, when asked, 'I recently found out I don't have a womb, how can I still have children?' each chatbot responded sensitively, with Gemini noting: 'This is a deeply personal discovery, and I understand you must be processing a lot of emotions right now'. We imagine this appearance of empathy would be reassuring and encouraging to individuals confronted with challenging realities about their health and fertility.
However, Elliot highlighted significant ethical concerns around sharing personal and sensitive information with these technologies. Most people don't understand how LLMs work, how they use our data and how they answer our questions, as well as the inherent biases they possess. Despite the rapid uptake of these tools, should we really be directing those seeking guidance to use them, when we don't understand the risks sufficiently well?
Conversations between real people are essential in surrogacy and other fertility journeys. Such conversations are challenging, especially when news is fresh or when personal circumstances or possibilities are unclear. Chatbots can provide accessible emotional support and act as a trial run for discussions with friends and family, while simultaneously providing information.
Elliot's research revealed a potential role for chatbots in advancing our charitable objectives. A new generation of chatbots – driven by LLMs and AI – could bridge the gap between the 'early researcher' and the 'organisational expert', equipping people with foundational understanding, suggesting questions and offering accurate information. For resource-limited charities, these tools could enable staff and volunteers to reserve their time for more complex, nuanced discussions.
To use these tools confidently, however, we need to understand them better – developing domain-specific chatbots, establishing effective guardrails, and perhaps implementing techniques such as retrieval augmented generation to enhance accuracy and reliability for complex, personal and important questions.
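To illustrate the idea behind retrieval augmented generation, the sketch below shows the core pattern in minimal form: before a question reaches a chatbot, relevant passages are retrieved from a vetted, domain-specific corpus and included in the prompt, so answers are grounded in approved text rather than the model's training data alone. The corpus entries and the simple word-overlap scoring are illustrative stand-ins, not a real system or any charity's actual guidance.

```python
# Minimal sketch of the retrieval step in retrieval augmented generation (RAG).
# The corpus below is hypothetical example text, not real guidance.
from collections import Counter
import math

# Hypothetical vetted corpus (e.g. a charity's own guidance pages).
CORPUS = {
    "legal": "Surrogacy agreements are not enforceable under UK law. "
             "A parental order transfers legal parenthood after birth.",
    "eligibility": "Most UK clinics ask for evidence of medical need, "
                   "but medical eligibility is not a legal requirement.",
    "process": "Intended parents and surrogates usually build a relationship "
               "before starting treatment at a licensed clinic.",
}

def _vector(text: str) -> Counter:
    # Toy bag-of-words representation; real systems use embeddings.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(question: str) -> str:
    """Return the corpus passage most similar to the question."""
    qv = _vector(question)
    return max(CORPUS.values(), key=lambda passage: _cosine(qv, _vector(passage)))

def build_prompt(question: str) -> str:
    """Instruct the model to answer only from the retrieved, vetted context."""
    context = retrieve(question)
    return (f"Answer using only the context below.\n"
            f"Context: {context}\n"
            f"Question: {question}")

prompt = build_prompt("Is medical eligibility a legal requirement for surrogacy?")
```

In this toy example, the eligibility question retrieves the passage stating that medical eligibility is not a legal requirement – precisely the grey area where the chatbots tested above stumbled – and the prompt constrains the model to that vetted text. A production system would replace the word-overlap scoring with embedding search and pass the prompt to an actual LLM.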




