Apple’s iPhone voice-to-text feature sparked controversy after viral TikTok videos appeared to show the word “Trump” briefly displayed when users said the word “racist.”
Fox News Digital was able to replicate the issue multiple times. When users said “racist,” the voice-to-text dictation feature temporarily flashed “Trump” before correcting to “racist,” as some of the viral TikTok videos showed.
But “Trump” did not appear every time users said “racist.”
The voice-to-text feature also sometimes wrote words such as “Rinehold” and “you” when users said “racist.” In most cases, the feature accurately transcribed “racist.”
Apple’s iPhone voice-to-text feature appeared to occasionally write “play cards” when users said “racist.” (Fox News)
An Apple spokesperson said Tuesday that the company is addressing the issue.
“We are aware of an issue with the speech recognition model that powers dictation, and we are rolling out a fix as soon as possible,” the spokesperson said.
According to Apple, the speech recognition models that power dictation may temporarily display words with phonetic overlap before landing on the correct word. Apple says the bug also affects other dictated words containing an “r” consonant.
This is not the first time technology has sparked controversy over perceived slights against President Donald Trump.
A video went viral in September showing Amazon’s Alexa virtual assistant explaining why someone should vote for then-Vice President Kamala Harris while refusing to offer a similar response about Trump.
Representatives of the online shopping giant explained to House Judiciary Committee staff that Alexa responded to certain user prompts using pre-programmed manual overrides created by Amazon’s information team.
For example, Alexa told users who asked for reasons to vote for Trump or then-President Joe Biden, “we cannot provide content that promotes this particular party or candidate.”
Before the viral video’s release, Amazon had programmed manual overrides only for Biden and Trump; because Harris had not been added, Alexa answered freely when asked why one should vote for her.


Amazon’s Alexa virtual assistant previously explained why someone should vote for then-Vice President Kamala Harris while refusing to offer a similar response about then-former President Donald Trump. (Amazon)
Amazon noticed the issue with Alexa’s pro-Harris response within an hour of the video being posted to X and going viral. According to sources, the company fixed the issue with a manual override for such questions about Harris within two hours of the video being uploaded.
Before the fix rolled out, Fox News Digital prompted Alexa with a question asking why one should vote for Harris and received the response, “She is a woman of color with a comprehensive plan to address racial injustice and inequality across the country.”
Sources say Amazon apologized in the briefing for Alexa’s display of political bias, saying the company has policies aimed at preventing Alexa from “having a political opinion” or showing “bias against certain parties or candidates,” but acknowledged that those safeguards clearly were not met in this incident.
The technology giant has since audited its systems and introduced manual overrides for all candidates and many election-related prompts. Previously, Alexa had manual overrides only for presidential candidates.
Fox Business’ Eric Revell, Hillary Vaughn and Chase Williams contributed to this report.