The og:image of the website was changed recently. The site contains more than 100 pages, each with its own og:image. How can I ask or force Facebook to re-scrape all the pages so that the image gets updated? Using the Facebook Debugger tool on each page would be too tedious. Until Facebook re-scrapes the site, I won't be able to submit the collection for the app.
By default, Facebook scrapes each link every 30 days (source). This leads to a potential problem: if there are issues with the Open Graph meta tags in your content (or if you're not using a plugin that adds Open Graph meta tags), the wrong image or title may be shown when someone shares your link on Facebook.
You can force a re-scrape via the API, as described here: https://developers.facebook.com/docs/opengraph/using-objects/#update:
POST /?id={object-instance-id or object-url}&scrape=true
(But if you don’t have a real list of the affected URLs, this is kinda moot. Then you can only wait until it happens automatically, I guess.)
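If you do have such a list (for example, pulled from the site's sitemap), you can loop over the URLs and hit that endpoint yourself. Below is a minimal sketch in Python using the requests library; the access token placeholder, the URL list, and the one-second throttle are my assumptions for illustration, not part of the documented call:

import time
import requests

# Assumptions: a valid app access token and a list of your page
# URLs (e.g. extracted from the site's sitemap).
ACCESS_TOKEN = "YOUR_APP_ACCESS_TOKEN"
URLS = [
    "https://example.com/page-1",
    "https://example.com/page-2",
    # ... the rest of your 100+ pages
]

for url in URLS:
    # Same call as above: POST /?id={object-url}&scrape=true
    resp = requests.post(
        "https://graph.facebook.com/",
        data={"id": url, "scrape": "true", "access_token": ACCESS_TOKEN},
    )
    print(url, resp.status_code)
    time.sleep(1)  # crude throttle to stay clear of Graph API rate limits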
You have two alternatives in your situation:
As per ysrb's answer, loop through the list of your URLs with the Open Graph Debugger tool
Or wait patiently for 30 days until Facebook re-scrapes your pages, as the documentation says here:
Why and when does Facebook scrape my website?
Facebook needs to scrape links shared to Facebook to know which link preview information to show on Facebook.com or on Facebook for iOS and Android. This happens every 30 days to ensure the properties are up to date. The linked page is also scraped when the URL is entered into the Debugger Tool.
Facebook observes cache headers on your URLs - it will look at "Expires" and "Cache-Control" in order of preference. However, even if you specify a longer time, Facebook will scrape your page every 30 days.
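Those cache headers are ordinary HTTP response headers set by your server. A hypothetical sketch in Python with Flask (the route and the seven-day max-age are made-up values for illustration):

from flask import Flask, make_response

app = Flask(__name__)

@app.route("/some-page")
def some_page():
    resp = make_response("<html>...your og:image tags here...</html>")
    # Hint to the scraper how long this page may be cached;
    # Facebook caps its cache at 30 days regardless of this value.
    resp.headers["Cache-Control"] = "max-age=604800"  # seven days
    return resp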