Meta has published its latest overview of content violations, hacking attempts, and feed engagement, which includes the regular array of stats and notes on what people are seeing on Facebook, what people are reporting, and what’s getting the most attention at any given time.
For example, the Widely Viewed Content Report for Q4 2024 includes the usual gems, like this:

Less than great news for publishers, with 97.9% of the views of Facebook posts in the U.S. during Q4 2024 not including a link to a source outside of Facebook itself.
That percentage has steadily crept up over the past four years, with Meta’s first Widely Viewed Content Report, published for Q3 2021, showing that 86.5% of the posts shown in feeds didn’t include a link outside the app.
It’s now radically high, meaning that getting an organic referral from Facebook is harder than ever, while Meta’s also de-prioritized links as part of its effort to move away from news content. Which it may or may not change again, now that it’s looking to allow more political discussion back into its apps. But the data, at least right now, shows that it’s still a pretty link-averse environment.
If you were wondering why your Facebook traffic has dried up, this would be a big part of it.
The top ten most viewed links in Q4 also show the usual array of random junk that’s somehow resonated with the Facebook crowd.

Astronauts celebrating Christmas, Mark Wahlberg posting a picture of his family for Christmas, Neil Patrick Harris singing a Christmas song. You get the idea: the usual range of supermarket tabloid headlines now dominates Facebook discussion, along with syrupy tales of seasonal sentiment.
Like: “Child Asks Santa Claus to Help Mom Instead of Asking For Toys”.
Sweet, sure, but also, ugh.
The top most-shared posts overall aren’t much better.

If you want to resonate on Facebook, you could probably take notes from celebrity magazines, as it’s that type of material which seemingly gains traction, while displays of virtue or “intelligence” still catch on in the app.
Make of that what you will.
In terms of rule violations, there weren’t any particularly notable spikes in the period, though Meta did report an increase in the prevalence of Violent & Graphic Content on Instagram due to adjustments to its “proactive detection technology.”

This also seems like a concern:

Also worth noting, Meta says that fake accounts “represented approximately 3% of our worldwide monthly active users (MAU) on Facebook during Q4 2024.”
That’s only notable because Meta usually pegs this number at 5%, which has seemingly become the industry standard, as there’s no real way to accurately determine this figure. But now Meta’s revised it down, which could mean that it’s more confident in its detection processes. Or it’s just changed the base figure.
Meta also shared this interesting note:
“This report is for Q4 2024, and does not include any data related to policy or enforcement changes made in January 2025. However, we have been monitoring these changes and so far we have not seen any meaningful impact on prevalence of violating content despite not proactively removing certain content. In addition, we have seen enforcement mistakes have measurably decreased with this new approach.”
That change is Meta’s controversial switch to a Community Notes model and its removal of third-party fact-checking, while Meta’s also revised some of its policies, particularly relating to hate speech, bringing them more into line, seemingly, with what the Trump Administration would prefer.
Meta says that it’s seen no major shifts in violative content as a result, at least not yet, but it is banning fewer accounts by mistake.
Which sounds good, right? Sounds like the change is already an improvement.
Right?
Well, it probably doesn’t mean much.
The fact that Meta is seeing fewer enforcement mistakes makes perfect sense, as it’s going to be enacting a lot less enforcement overall, so of course, mistaken enforcement will inevitably decrease. But that’s not really the question; the real issue is whether rightful enforcement actions remain steady as it shifts to a less supervisory model, with more leeway on certain speech.
As such, the statement here seems more or less pointless at this stage, and more of a blind retort to those who’ve criticized the change.
In terms of threat activity, Meta detected several small-scale operations in Q4, originating from Benin, Ghana, and China.
Though potentially more notable was this explainer in Meta’s overview of a Russia-based influence operation called “Doppelganger,” which it’s been tracking for several years:
“Starting in mid-November, the operators paused targeting of the U.S., Ukraine and Poland on our apps. It’s still focused on Germany, France, and Israel with some isolated attempts to target people in other countries. Based on open source reporting, it appears that Doppelganger has not made this same shift on other platforms.”
Apparently, after the U.S. election, Russian influence operations became less interested in swaying sentiment in the U.S. and Ukraine. Seems like an interesting shift.
You can read all of Meta’s latest enforcement and engagement data points in its Transparency Center, if you’re looking to get a better understanding of what’s resonating on Facebook, and the shifts in its safety efforts.