Christina Pellegrini
May 13, 2017 11:51 am

If there’s one thing pregnant women (and all women) get sick of, it’s advice. And thanks to technology, we now also have to worry about robots joining the chorus of unnecessary wisdom, along with church ladies and that one dude at work who has a baby cousin.

You probably don’t need us to tell you that any advice offered by a bot should be taken with a grain of salt. But here we are, with a particularly troubling situation. Apparently there’s a bot out there offering nutrition advice to pregnant women. AND IT’S GETTING EVERYTHING ALL WRONG.

Lifehacker’s Beth Skwarecki discovered the misinformed bot when a tweet about it came across her feed.

Intrigued, Beth started asking the bot some pretty normal pregnancy nutrition questions. But she got some troubling answers back.

First, it told her to avoid deli meat because of the high risk of listeria (true). But it also said deli meats are okay as long as they don’t contain nitrites or nitrates (not true). FYI, nitrites and nitrates should be fine for everyone.

Doesn’t sound like a super big deal yet, right? Maybe it’s just erring on the side of caution?

Well, what about when the bot said that eggplants “contain large amounts of phytohormones which can stimulate menstruation when consumed daily”? This is NOT TRUE.

And how about this? When Beth asked if she could eat mushrooms, the bot said, “Haaayyyllll naw…Raw mushrooms are also a big no since they are carcinogenic.” ALSO NOT TRUE.

Ummm, what is going on here?! This is a particularly important thing to get right. Who allowed this to happen?!

Apparently, the bot’s maker, reply.ai, cited the FDA, the Mayo Clinic, and Parents.com as its sources. But when Beth dug deeper, the company also admitted to using MomJunction, an India-based site full of questionable advice that doesn’t cite its sources.

Moral of the story? Steer clear of bot advice in general. While the tech is promising, it’s clearly not all there just yet. There’s correct info out there, we promise!

(H/T to Lifehacker)
