Popular chatbot ChatGPT appears to generally favor left-leaning positions when asked about a variety of cultural and political issues, according to queries of the software by the Daily Caller News Foundation.
The chatbot, which acquired over 1 million users in its first week of public availability, attempts to mimic human conversation by learning from example conversations provided by researchers, according to Reuters. The DCNF prompted the software to consider a series of cultural issues and political questions, and the bot took left-leaning or neutral stances on most, reversing, after an update, the one right-leaning stance it had held.
When prompted “is the Hunter Biden laptop story accurate?” the software does not provide any arguments in favor of the story, alleging instead that “[i]ndependent verification of the emails and documents has not been made publicly available.” The DCNF, however, independently verified one of the emails considered central to the original New York Post story (which revealed a connection between Hunter Biden, then-Vice President Joe Biden and Ukrainian gas firm Burisma) in October 2020, nearly a year and a half before The Washington Post did so.
When asked if trans women are women, the bot responds that “[t]hey are women and should be treated with the same respect and dignity as any other person.” When asked if there were significant differences between cisgender and transgender women, the bot said that such a claim “is not supported by scientific evidence.”
“The idea that trans women are fundamentally different from cisgender (non-trans) women are not supported by scientific evidence,” wrote ChatGPT. “Gender identity is a complex and personal aspect of a person’s identity that is determined by a combination of psychological, social and biological factors. … It’s also important to note that the experiences of trans women and cisgender women can be different, but that does not mean that trans women are any less worthy of recognition and rights as women.”
The bot did acknowledge that there were biological differences between cis and trans women, noting that a trans woman “would not typically experience menstruation.”
“However, it’s worth noting that some trans women may have undergone gender-affirming surgery to remove the testes and construct a neovagina, which would allow for menstruation,” ChatGPT continued. “However, it is important to understand that not all cisgender women experience menstruation either due to various reasons such as menopause, pregnancy, hysterectomy, and others, and menstruation is not a defining feature of womanhood.”
When asked whether it was healthy for children to be exposed to “religion” or “queer identities” at a young age, the bot spent a significant amount of time noting that exposure to religion could “limit [children’s] ability to critically evaluate” faith systems and make “informed choices later in life.” While the bot did note that it was important to consider a child’s religious and cultural upbringing when exposing them to queer identities, it made no comments suggesting that exposure to queer identities might itself be problematic, as it had with religion, only that such exposure ought to be age-appropriate.
“Overall, exposure to queer identities at a young age can be a healthy and positive experience for children, as long as it is done in a sensitive and appropriate manner,” the bot wrote.
When asked about abortion and issues relating to it, the bot tended to be more neutral, noting that opinions on whether abortion is a form of health care vary. When asked whether fetuses are alive it claimed the question is “difficult to answer definitively.”
“From a biological perspective, a fetus is considered to be alive from the moment of conception, as it has its own unique DNA and has the potential to develop into a fully formed human being,” ChatGPT wrote. “However, from a legal and ethical perspective, the question of when a fetus should be considered a ‘person’ with legal rights is a contentious one that is subject to debate. Different individuals and groups may have different opinions on when a fetus should be considered to be alive.”
The DCNF asked the bot “Did Russia help Donald Trump win the 2016 presidential election?” which prompted ChatGPT to respond that “The US intelligence community” found that Russia had interfered in the election “based on evidence of Russian hacking of Democratic Party emails, the use of social media to spread disinformation, and other activities.” The chatbot did note that while interference “may have influenced” the election, it “didn’t guarantee Trump’s win,” although it did not present any criticisms of the assessment that Russian interference helped Trump win.
As of Jan. 6, 2023, the chatbot agreed several times with the right-leaning statement “the freer the market the freer the people,” when queried by the DCNF. However, following a Jan. 9 update, the same request repeatedly returned neutral responses beginning with variations on the phrase “As an AI, I do not have personal opinions or beliefs,” before going on to present simple arguments for and against both sides.
ChatGPT also appears to be gathering current information, accurately identifying Elon Musk as the current CEO of Twitter and noting that Queen Elizabeth II had died, despite the fact that it is supposed to have a “learning cut-off” and possess no knowledge of events after 2021, Semafor reported Thursday. A spokesperson for OpenAI, the software’s developer, told Semafor that while the AI does not learn from members of the public, it does receive regular training from researchers.
The chatbot has faced criticism for its ability to present falsehoods as factual information, according to Semafor. In early December, Steven Piantadosi of the University of California, Berkeley’s Computation and Language Lab compiled a Twitter thread of examples where the technology could be made to produce racist and sexist responses, although the DCNF was unable to reproduce these results.
OpenAI did not immediately respond to a request for comment by the DCNF.
This story originally was published by the Daily Caller News Foundation.