AndySimpson (Local Search Expert, LocalU Faculty)

Working with @Amy Toman, we've found a few images that might cause a GMB post to be filtered.

Here's a quick discussion we had on Twitter today that may help explain the issue.

Amy's Tweet about rejected images in GMB Posts

I then suggested using Google's Vision AI tool, which may help you decide whether your image is "Racy" or not.

My response and findings after using Google's Vision AI

So, if in doubt, "Try the API." It's probably best to have "Very Unlikely" indicators for the images you use in GMB Posts.

[Image attachment: try_the_api.png]
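If you'd rather script this than paste images into the web demo one at a time, here's a rough sketch using Google's Python client for the Vision API. This assumes you've installed google-cloud-vision and set up credentials; the file name is just an example:

```python
# Rough sketch: run Vision API SafeSearch on a local image.
# Assumes `pip install google-cloud-vision` and GOOGLE_APPLICATION_CREDENTIALS
# pointing at a service account key. The file name below is an example.
from google.cloud import vision

def safe_search(path):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    result = client.safe_search_detection(image=image).safe_search_annotation
    # Each category comes back as a likelihood, VERY_UNLIKELY up to VERY_LIKELY
    for category in ("adult", "spoof", "medical", "violence", "racy"):
        print(f"{category}: {getattr(result, category).name}")

safe_search("gmb_post_image.png")
```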
 
Great tip, Andy! Thanks for sharing!!
 
These photos were posted on GMBs for attorneys who specialize in abuse and harassment cases. The post text was relatively benign, but the photos were rejected with no rationale noted.

I'd like to know more about how these are judged by Google. As asked in the Twitter thread with Tim Capper, are images judged on their own merits, or in conjunction with the accompanying text? (We had two images rejected. Only one, in my opinion, could be judged on its own merits; the other appeared to need the context provided by the post.)

A. I wish there were clearer guidelines on posting images in GMB posts. Is there any info in The Guide on this? Would love it to be included.

B. I hope this doesn't prove problematic for other industries that provide resources to abuse victims / survivors.
 
Honestly, I don't think they're judged at all - I think they have an algo (like Vision AI) that scans the skin-to-clothing/everything-else ratio, and if any flags come up (racy, adult, violence), they kick them back out.

I've seen lingerie images get through that leave pretty much NOTHING to the imagination - but because there were "clothes" that covered skin... they were OK. I don't think there's a lot of thought that goes into this beyond the AI and an algorithm to kick them out.
 
Carrie, would you mind doing a quick Vision AI scan on a "typical" lingerie item and seeing if it flags "Racy" or "Adult"? Please and thank you.

My search history is bad enough with the work I do; I'd like to avoid seeing adverts for lingerie in my FB feed for the next few days... :sneaky:
 
@AndySimpson I just did this one - it shows up as just as racy as your pic, but less "adult" than it. Not sure if it would hit the filter, but it's decidedly more "adult" than the kid-swimming pic. I added the black bar after I scanned it - nobody needs to see that this early in the morning :)
[Image attachment: racy.png]
 
Thank you, Carrie. Okay, so I wonder if you could post this? Racy is "Possible" but Adult is "Very Unlikely"... so does that mean Google Posts allows "Racy" images? 🤷‍♂️

Is there anyone out there who can test this for us, please?

1. Be careful
2. Use Vision AI first (see the sketch below)
3. Note "safe search" results
4. Post and see if it sticks...
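For anyone who wants to automate steps 2 and 3, here's a speculative pre-flight check building on the sketch above. The threshold is pure guesswork based on what we've seen in this thread (Racy at "Possible" looks like the borderline; nobody outside Google knows the real cutoffs):

```python
# Speculative pre-flight check for GMB post images. The threshold
# (anything at POSSIBLE or above on adult/racy/violence) is a guess
# based on this thread, not a documented Google rule.
from google.cloud import vision

RISKY_CATEGORIES = ("adult", "racy", "violence")

def preflight(path):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    result = client.safe_search_detection(image=image).safe_search_annotation
    flagged = {
        category: getattr(result, category).name
        for category in RISKY_CATEGORIES
        if getattr(result, category) >= vision.Likelihood.POSSIBLE
    }
    if flagged:
        print(f"Think twice before posting {path}: {flagged}")
    else:
        print(f"{path} looks clean - post it and see if it sticks.")

preflight("gmb_post_image.png")
```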
 
Update: so today I am posting for an attorney with two brick-and-mortar locations. I posted on sex crimes defense for the first location and all was fine. I posted for the second location, and the image was rejected. I changed the file name from (oops) Sex Crime Defense to Attorney Consultation, reposted it with the same text, and it was accepted.

Two questions: is this an indication that file names are considered, and how is this image considered racy?

[Image attachment: 1580999735776.png]
 
I do think the filename is considered if it has words on the nanny list. My $.02.
 
Edit / addition: this is the text that accompanies both posts:

Sex crimes are some of the most serious allegations you can face. Not only can they carry extensive penalties, but they also have long-lasting consequences for all aspects of your life. If convicted, you are forever branded as an offender, and, in most cases, you must register as one for the rest of your life. This could result in public embarrassment, job loss, and lifelong stigma for you and your family. My name is xxxxxxxx, and I provide my clients with the serious, tough defense that this type of allegation requires. At xxxxxx, Attorney at Law, my clients receive trustworthy representation from an experienced [location] criminal defense attorney during a sex crime case. Call us ☎️ in [location] to get started today.
 
I think the file name in an IMAGE is considered because of the nanny list. They're worried it will be a nasty pic, and their AI is not as fine-tuned as their text algo.

They can put the words "sex," "crimes," "lawyer," and "attorney" together and get a pretty good idea that the text is nothing nefarious - but they still struggle with photos... it's a young AI and will grow and learn the more it's used/developed.

Certainly not a perfect system - and I've heard that posts are esp. borked right now - so it could be a combo of things that lets one through and gets the other kicked back.
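If the nanny-list theory is right, the check on Google's end could be as dumb as a word match on the filename. Purely a guess - this word list is invented for illustration:

```python
# Purely speculative: a filename screen against a hypothetical "nanny list".
# Nobody outside Google knows what's actually on the list; these words are
# invented to illustrate why "Sex Crime Defense.png" might get kicked back.
NANNY_LIST = {"sex", "crime", "crimes", "abuse", "harassment"}

def filename_flagged(filename: str) -> bool:
    words = filename.lower().replace("-", " ").replace("_", " ").split()
    return any(word in NANNY_LIST for word in words)

print(filename_flagged("Sex Crime Defense.png"))      # True - rejected
print(filename_flagged("Attorney Consultation.png"))  # False - accepted
```

That would also explain why the same image with the same text sails through once the filename changes.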
 
I hope you're right, @CarrieHill, about the system being borked. Because this just happened.

I am loading GMB posts for one law firm, multiple legit locations. First four, no problem on anything. On number five, the image below was rejected. I changed the file name from Hospital Errors to Clipboard and it was accepted. Here's the text and image I used. It is beyond me how a photo of a clipboard could be considered "racy."

Hospitals can be dangerous places. According to the National Institutes of Health, approximately 1 in 25 patients at acute care hospitals will develop another illness or infection during their stay. Many times, it is the negligence of the medical personnel that increases your risk of becoming seriously ill or injured while in an emergency room, hospital, or healthcare facility. At xxxx, we advocate on behalf of people who suffered further harm in a [location 1] hospital or healthcare institution. For the last 30 years, our skilled team of personal injury and medical malpractice attorneys has protected the rights of victims throughout the state. We are prepared to handle difficult or complex cases. If you were hurt in a [location 1] hospital, or if your emergency care caused you harm, we want to hear your story. Call us ☎️ in [location 2] today to get started.

[Image attachments: 1581004074886.png, 1581004008873.png]
 
Well that naked hand holding a pen is pretty racy, I guess... :oops:

But what caught my eye was the "Medical - Unlikely" entry. Looks like a stock photo rather than an actual photo of the business. But what does "unlikely" mean here? That it's unlikely to be an NSFW photo site?
 
Probably just matching on skin-tone colors in this case. The filename change to pass through the filter is interesting, though.
 
I am more inclined to think it has to do with stock photos, which GMB considers low quality. We have had a ton of employment-lawyer posts rejected, and after we replace the stock photos with more "real"-looking photos, they are approved.
 
Proprietary image, low risk on all fronts - still rejected. Could it be that competitors are going in, reporting it, and Google is just blindly accepting these reports without much thought?

[Image attachments: 1581099611380.png, 1581099592891.png]
 
A user on the GMB forum suggested creating the post without an image, then editing the post and adding the image. Honestly, I am hearing so many different workarounds that I am more and more convinced it is a technical issue that is super temperamental.

Personally, I think the entire post feature needs to be killed off by Google. But that's for another day and another topic.
 
