You are here
In the past, comparing ad performance across media platforms has been a challenge for brands. Advertisers have asked us to help them better understand the effectiveness of Facebook ads compared to other forms of advertising—particularly TV ads. Advertisers are also looking for ways to unify metrics and reporting. In fact, 79% of marketers say they would prefer to use one set of metrics across all screens.(1)
Today, we’re introducing new measurement solutions that will enable advertisers to see the impact of their ad campaigns on both Facebook and TV, as well as the incremental impact of both platforms when used together. Nielsen Total Brand Effect with Lift and Facebook Cross-Platform Brand Lift both provide advertisers with the ability to evaluate the impact of their campaigns across Facebook and TV.

Two ways to run a cross-platform brand lift study
Nielsen Total Brand Effect with Lift: Available today through our partnership with Nielsen, advertisers can interpret their cross-platform results as measured by Nielsen. This solution leverages Nielsen’s expertise in television measurement and its database of television programming to poll for TV ad exposure. Results are delivered independently by Nielsen. Nielsen Total Brand Effect with Lift is available now for eligible advertisers via their Facebook representative. For a more holistic view of effectiveness, advertisers can couple their brand measurement with cross-platform reach measurement from Nielsen Total Ad Ratings (TAR). Together, TAR and Total Brand Effect give marketers a read on the relative reach and efficiency of spend across an entire campaign.
The product is currently available in both the US and the UK, and will also be available in Australia by the end of the year.
Results from a Nielsen Total Brand Effect with Lift campaign can be found in the section “Facebook and TV ads work better together” below.
Facebook Cross-Platform Brand Lift: Facebook will offer polling and results from cross-platform brand lift measurement for ads on Facebook, Instagram and Audience Network. These studies will be available at lower spend minimums than measurement through our partners and will also offer self-serve reporting, including reporting on Facebook usage during commercial breaks. We are starting to build this solution now, and hope to make it available to advertisers in early 2018.
This graphic is provided for illustration purposes only. Exact reporting details are subject to change.

Determine brand lift generated by each platform
These cross-platform brand lift solutions combine lift measurement (a test-vs-control comparison) with an opportunity-to-see methodology for determining TV ad exposure. Opportunity-to-see assesses the likelihood that poll respondents were exposed to a given ad by asking whether they have seen the TV programs in which the ad aired. After answering a question that determines whether they viewed those programs in the previous day, individuals are then asked questions that gauge their perception of the advertiser’s brand. Comparing lift in the exposed groups with the unexposed group gives marketers a more complete understanding of how their Facebook and TV ads drive impact independently and together.
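As a rough illustration of the test-vs-control arithmetic behind these studies, lift in a brand metric is the exposed group’s positive response rate minus the control group’s, expressed in percentage points. The sketch below is purely hypothetical: the group names and survey counts are illustrative assumptions, not actual Nielsen or Facebook reporting output.

```python
# Hypothetical sketch of test-vs-control brand lift; the groups and
# counts below are illustrative assumptions, not real study data.

def brand_lift(exposed_yes, exposed_total, control_yes, control_total):
    """Return lift in percentage points: exposed rate minus control rate."""
    exposed_rate = exposed_yes / exposed_total
    control_rate = control_yes / control_total
    return round((exposed_rate - control_rate) * 100, 1)

# Illustrative survey counts: "yes" answers to a brand-recall question.
groups = {
    "TV + Facebook": (610, 1000),
    "TV only":       (460, 1000),
    "Facebook only": (430, 1000),
}
control = (400, 1000)  # respondents with no opportunity to see the ad

for name, (yes, total) in groups.items():
    print(f"{name}: {brand_lift(yes, total, *control)}-point lift")
```

Comparing the separate exposed groups against the same unexposed control is what lets a study report Facebook-only, TV-only and combined lift side by side.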
Optimize your Facebook and TV campaigns using actionable results
At the end of a cross-platform brand lift study, advertisers will get information on lift generated by Facebook alone, lift generated by TV alone and lift generated by Facebook and TV together.
This will help advertisers understand if they are effectively and efficiently engaging their audience on each platform. Using these studies, advertisers will be able to improve future campaign performance by optimizing creative, increasing on-target reach (with Nielsen Total Ad Ratings) and spending their media budget more efficiently across platforms.

Facebook and TV ads work better together
Advertisers are already seeing the value of these brand lift solutions. For example, Shark, a pioneer in small household appliances and cleaning solutions, wanted to understand how TV and Facebook generate brand awareness both together and separately, so they could determine the efficacy and efficiency of their overall advertising strategy.
Shark’s campaign helped establish the effectiveness of a combined TV and Facebook video ad strategy.
From April 24 to June 18, 2017, a Nielsen Total Brand Effect with Lift study measured the campaign’s incremental impact. The study revealed these results:
- A 22-point lift in ad recall for TV and Facebook (versus a 6-point lift for TV-only and a 3-point lift for Facebook-only)
- An 8-point lift in purchase intent for TV and Facebook (versus a 6-point lift for TV-only and a 1-point lift for Facebook-only)
- A 6-point lift in awareness for TV and Facebook (versus a 3-point lift for TV-only and a 3-point lift for Facebook-only)
“We proved that Facebook video ads are a natural complement to TV campaigns. We experienced better brand results among people who saw ads on both versus just TV or Facebook alone. We saw the ‘better together’ impact first-hand. Facebook and TV are powerful individually, but deliver a stronger message to our audience when used in tandem.” —Ajay Kapoor, VP, Digital Transformation & Strategy, SharkNinja
Additionally, our advertising partners, including those who are expanding from digital advertising into cross-media campaigns, have expressed excitement about the opportunity to leverage the Facebook Cross-Platform Brand Lift solution.
“Now that Buzzfeed has begun to diversify our media strategies to include both Television and Digital, having the option to leverage solutions such as Facebook’s Cross-Platform Brand Lift and Nielsen Total Brand Effect with Lift presents a great opportunity. We look forward to using cross-platform brand lift measurement to both receive valuable insights about our multi-media campaign performance in a single reporting surface, and also to optimize campaign elements such as spend and creative across both platforms.” — Margo Arton, Senior Director of Ad Effectiveness at BuzzFeed
Advertisers now have the opportunity to measure the impact of their Facebook and TV campaigns together using Nielsen Total Brand Effect with Lift, and we look forward to rolling out the Facebook Cross-Platform Brand Lift solution in the coming months.
By Elliot Schrage, Vice President of Policy and Communications
1) Why did Facebook finally decide to share the ads with Congress?
As our General Counsel has explained, this is an extraordinary investigation — one that raises questions that go to the integrity of the US elections. After an extensive legal and policy review, we’ve concluded that sharing the ads we’ve discovered with Congress, in a manner that is consistent with our obligations to protect user information, will help government authorities complete the vitally important work of assessing what happened in the 2016 election. That is an assessment that can be made only by investigators with access to classified intelligence and information from all relevant companies and industries — and we want to do our part. Congress is best placed to use the information we and others provide to inform the public comprehensively and completely.
2) Why are you sharing these with Special Counsel and Congress — and not releasing them to the public?
Federal law places strict limitations on the disclosure of account information. Given the sensitive national security and privacy issues involved in this extraordinary investigation, we think Congress is best placed to use the information we and others provide to inform the public comprehensively and completely. For further understanding on this decision, see our General Counsel’s post.
3) Let’s go back to the beginning. Did Facebook know when the ads were purchased that they might be part of a Russian operation? Why not?
No, we didn’t.
The vast majority of our over 5 million advertisers use our self-service tools. This allows individuals or businesses to create a Facebook Page, attach a credit card or some other payment method and run ads promoting their posts.
In some situations, Facebook employees work directly with our larger advertisers. In the case of the Russian ads, none of those we found involved in-person relationships.
At the same time, a significant number of advertisers run ads internationally, and a high number of advertisers run content that addresses social issues — an ad from a non-governmental organization, for example, that addresses women’s rights. So there was nothing necessarily noteworthy at the time about a foreign actor running an ad involving a social issue. Of course, knowing what we’ve learned since the election, some of these ads were indeed both noteworthy and problematic, which is why our CEO today announced a number of important steps we are taking to help prevent this kind of deceptive interference in the future.
4) Do you expect to find more ads from Russian or other foreign actors using fake accounts?
When we’re looking for this type of abuse, we cast a wide net in trying to identify any activity that looks suspicious. But it’s a game of cat and mouse. Bad actors are always working to use more sophisticated methods to obfuscate their origins and cover their tracks. That in turn leads us to devise new methods and smarter tactics to catch them — things like machine learning, data science and highly trained human investigators. And, of course, our internal inquiry continues.
It’s possible that government investigators have information that could help us, and we welcome any information the authorities are willing to share to help with our own investigations.
Using ads and other messaging to affect political discourse has become a common part of the cybersecurity arsenal for organized, advanced actors. This means all online platforms will need to address this issue, and get smarter about how to address it, now and in the future.
5) I’ve heard that Facebook disabled tens of thousands of accounts in France and only hundreds in the United States. Is this accurate?
No, these numbers represent different things and can’t be directly compared.
To explain it, it’s important to understand how large platforms try to stop abusive behavior at scale. Staying ahead of those who try to misuse our service is an ongoing effort led by our security and integrity teams, and we recognize this work will never be done. We build and update technical systems every day to make it easier to respond to reports of abuse, detect and remove spam, identify and eliminate fake accounts, and prevent accounts from being compromised. This work also reduces the distribution of content that violates our policies, since fake accounts often distribute deceptive material, such as false news, hoaxes, and misinformation.
This past April, we announced improvements to these systems aimed at helping us detect fake accounts on our service more effectively. As we began to roll out these changes globally, we took action against tens of thousands of fake accounts in France. This number represents fake accounts of all varieties, the most common being those that are used for financially motivated spam. While we believe that the removal of these accounts also reduced the spread of disinformation, it’s incorrect to state that these tens of thousands of accounts represent organized campaigns from any particular country or set of countries.
In contrast, the approximately 470 accounts and Pages we shut down recently were identified by our dedicated security team that manually investigates specific, organized threats. They found that this set of accounts and Pages were affiliated with one another — and were likely operated out of Russia.
By Colin Stretch, General Counsel
Two weeks ago, we announced we had found more than 3,000 ads addressing social and political issues that ran in the US between 2015 and 2017 and that appear to have come from accounts associated with a Russian entity known as the Internet Research Agency. We subsequently made clear that we are providing information related to those ads, including the ad content itself, to the Special Counsel investigating allegations of Russian interference in the 2016 US election. Since then, some people have asked why we aren’t sharing the content of the ads more broadly.
After an extensive legal and policy review, today we are announcing that we will also share these ads with congressional investigators. We believe it is vitally important that government authorities have the information they need to deliver to the public a full assessment of what happened in the 2016 election. That is an assessment that can be made only by investigators with access to classified intelligence and information from all relevant companies and industries — and we want to do our part. Congress is best placed to use the information we and others provide to inform the public comprehensively and completely.
This has been a difficult decision. Disclosing content is not something we do lightly under any circumstances. We are deeply committed to safeguarding user content, regardless of the user’s nationality, and ads are user content. Federal law also places strict limitations on the disclosure of account information. As our biannual transparency reports make clear, we carefully scrutinize all government data requests, from here and abroad, and we push back where they do not adhere to those legal limitations. And, of course, we also recognize and support the important work of government investigations and take care not to take steps, like public disclosures, that might undermine them.
Over recent weeks, we have grappled with the extraordinary nature of this particular investigation through this lens. The questions that have arisen go to the integrity of US elections. And the limited information Congress and the intelligence community have shared with us to date suggests that efforts to compromise the 2016 election were varied and sophisticated — and that understanding those efforts requires a united effort, from across the technology, intelligence and political communities. We believe the public deserves a full accounting of what happened in the 2016 election, and we’ve concluded that sharing the ads we’ve discovered, in a manner that is consistent with our obligations to protect user information, can help.
That’s why we have reached out to congressional leadership to agree on a process and schedule to provide the content of these ads, along with related information, to congressional investigators. At the same time, we will continue our own review and investigation, and to do our part to make sure investigators have the information they need. We look forward to their comprehensive assessment, and to a greater public understanding of what took place.
See also: Hard Questions: More on Russian Ads
By Yann LeCun, Chief AI Scientist
At Facebook, we think artificial intelligence can play a big role in helping bring the world closer together. With that in mind, we’ve been investing in AI research and engineering for many years — and today we’re excited to announce an expansion of those efforts with the opening of a new AI research lab in Montreal.
As part of Facebook AI Research (FAIR), this new team will join more than 100 scientists across Menlo Park, New York, and Paris in working to advance the field of artificial intelligence. The Montreal lab will house research scientists and engineers working on a wide range of ambitious AI research projects, but it will also have a special focus on reinforcement learning and dialog systems.
We are excited the new lab will be led by renowned Professor Joelle Pineau, who co-directs the Reasoning and Learning Lab at McGill University. Dr. Pineau’s previous research has focused on developing new algorithms for planning and learning and then applying them to complex problems in robotics, health care, games, and conversational agents. Dr. Pineau will maintain her academic position at McGill University, in addition to building the FAIR Montreal team. We think the talent we can attract will bring valuable expertise and new perspectives to our work, and under Dr. Pineau’s leadership, we will continue to invest in this team and in the Canadian research community as a whole.
As we’ve done at other FAIR sites, FAIR Montreal will engage with the broader research community through publications, open source software, participation in technical conferences and workshops, and research collaborations. We are also launching new partnerships with the Canadian Institute for Advanced Research (CIFAR), the Montreal Institute for Learning Algorithms (MILA), McGill University, and Université de Montréal.
Montreal already has a fantastic academic AI community, an exciting ecosystem of startups, and promising government policies to encourage AI research. We are excited to become part of this larger community, and we look forward to engaging with the entire ecosystem and helping it continue to thrive.
Facebook equips businesses with powerful ways to reach the right people with the right message. But there are restrictions on how audience targeting can be used on Facebook. Hate speech and discriminatory advertising have no place on our platform. Our community standards strictly prohibit attacking people based on their protected characteristics, including religion, and we prohibit advertisers from discriminating against people based on religion and other attributes.
As people fill in their education or employer on their profile, we have found a small percentage of people who have entered offensive responses, in violation of our policies. ProPublica reported that these offensive education and employer fields were showing up in our ads interface as targetable audiences for campaigns. We immediately removed them. Because the number of people in these segments was incredibly low, an extremely small number of people were targeted in these campaigns.
Keeping our community safe is critical to our mission. And to help ensure that targeting is not used for discriminatory purposes, we are removing these self-reported targeting fields until we have the right processes in place to help prevent this issue. We want Facebook to be a safe place for people and businesses, and we’ll continue to do everything we can to keep hate off Facebook.
Advertisers can report any inappropriate targeting fields directly in the ads interface or via our Help Center.
By Mike Nowak, Product Director, Social Good
Today, we’re announcing Crisis Response, a new center on Facebook where people can find more information about recent crises and access our crisis response tools – including Safety Check, Community Help and Fundraisers to support crisis recovery – all in one place. As part of this update, we are also introducing links to articles, videos and photos posted publicly by the Facebook community, to help people be more informed about a crisis.
Crisis Response on Facebook
We have developed a number of crisis response tools, based on what we’ve learned from our community. When there is a crisis, people use Facebook to let their friends and family know they’re safe, learn and share more about what’s happening, and help communities recover. People will be able to access Crisis Response on Facebook in the upcoming weeks from the homepage on desktop or from the menu button on their phone. They will see the following tools when they’re on a crisis page:
- Safety Check: an easy way to let your friends and family know you’re safe. It will continue to work the same way it does today and will be featured at the top of each crisis page if you are in the affected area.
- Links to Articles, Photos and Videos: crisis-related content from public posts can help people learn more about a crisis.
- Community Help: people can ask for and give help to communities affected by the crisis.
- Fundraisers: let people create fundraisers and donate to support those affected by the crisis and nonprofit organizations helping with relief efforts.
Adding More Crisis-Related Content
When people receive Safety Check notifications or learn that a crisis has happened, they may not know much about the incident and want to learn more. Starting today, we will begin to include links to articles, photos, and videos from public posts so people have access to more information about a crisis in one place. Safety Check activations and related information may also appear in News Feed to help provide additional details about a crisis.
We hope these updates continue to provide people with helpful information to keep them safe and help communities to rebuild and recover.
By Antigone Davis, Head of Global Safety
We’re recognizing World Suicide Prevention Day by letting people know about the tools and resources we have developed for people who may be at risk.
Throughout September, we’ll connect people with information about supportive groups and suicide prevention tools through ads in News Feed. We are also launching a new section of our Safety Center with additional resources about suicide prevention and online well-being. People can access tools to resolve conflict online, help a friend who is expressing suicidal thoughts or get resources if they’re going through a difficult time. We’ve offered tools like these, developed in collaboration with mental health organizations, for more than ten years. It’s part of our ongoing effort to help build a safe community on and off Facebook.
Because of the relationships people have on Facebook, we are in a unique position to help connect those in distress with friends who can show support. Mental health experts say these connections can be helpful in preventing suicide, and we see it happen in a variety of ways.
People’s friends are in the best position to know when they’re struggling – and speed is critical – so they can reach out directly through things like comments on a post. As we recently shared, there are cases where the combination of technology — recognizing patterns in people’s comments on posts — and the compassion of people in our community can help prevent harm.
People can also reach out to Facebook when they see something that makes them concerned about a friend’s well-being. We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide.
For those who reach out to us, we provide suggested text to make it easier for people to start a conversation with their friend in need and also provide information and resources for how to best handle the situation. We provide the friend who has expressed suicidal thoughts information about local help lines, along with other tips and resources. Thanks to over 80 partners around the world, the resources people see are specific to where they are located.
We take other steps, such as working with suicide prevention partners to collect phrases, hashtags and group names associated with online challenges encouraging self-harm or suicide. We offer resources to people that search for these terms on Facebook. We also remove content that violates our Community Standards, which don’t allow the promotion of self-injury or suicide.
With the help of our partners and people’s friends and family on Facebook, we hope we can continue to support those in need.
By Alex Stamos, Chief Security Officer
There have been a lot of questions since the 2016 US election about Russian interference in the electoral process. In April we published a white paper that outlined our understanding of organized attempts to misuse our platform. One question that has emerged is whether there’s a connection between the Russian efforts and ads purchased on Facebook. These are serious claims and we’ve been reviewing a range of activity on our platform to help understand what happened.
In reviewing the ad buys, we have found approximately $100,000 in ad spending from June of 2015 to May of 2017 — associated with roughly 3,000 ads — that was connected to about 470 inauthentic accounts and Pages in violation of our policies. Our analysis suggests these accounts and Pages were affiliated with one another and likely operated out of Russia.
We don’t allow inauthentic accounts on Facebook, and as a result, we have since shut down the accounts and Pages we identified that were still active.
- The vast majority of ads run by these accounts didn’t specifically reference the US presidential election, voting or a particular candidate.
- Rather, the ads and accounts appeared to focus on amplifying divisive social and political messages across the ideological spectrum — touching on topics from LGBT matters to race issues to immigration to gun rights.
- About one-quarter of these ads were geographically targeted, and of those, more ran in 2015 than 2016.
- The behavior displayed by these accounts to amplify divisive messages was consistent with the techniques mentioned in the white paper we released in April about information operations.
In this latest review, we also looked for ads that might have originated in Russia — even those with very weak signals of a connection and not associated with any known organized effort. This was a broad search, including, for instance, ads bought from accounts with US IP addresses but with the language set to Russian — even though they didn’t necessarily violate any policy or law. In this part of our review, we found approximately $50,000 in potentially politically related ad spending on roughly 2,200 ads.
We have shared our findings with US authorities investigating these issues, and we will continue to work with them as necessary.
Authentic Activity Matters
We know we have to stay vigilant to keep ahead of people who try to misuse our platform. We believe in protecting the integrity of civic discourse, and require advertisers on our platform to follow both our policies and all applicable laws. We also care deeply about the authenticity of the connections people make on our platform.
Earlier this year, as part of this effort, we announced technology improvements for detecting fake accounts and a series of actions to reduce misinformation and false news. Over the past few months, we have taken action against fake accounts in France, Germany, and other countries, and we recently stated that we will no longer allow Pages that repeatedly share false news to advertise on Facebook.
Along with these actions, we are exploring several new improvements to our systems for keeping inauthentic accounts and activity off our platform. For example, we are looking at how we can apply the techniques we developed for detecting fake accounts to better detect inauthentic Pages and the ads they may run. We are also experimenting with changes to help us more efficiently detect and stop inauthentic accounts at the time they are being created.
Our ongoing work on these automated systems will complement other planned projects to help keep activity on Facebook authentic. We’re constantly updating our efforts in this area, and have introduced a number of improvements, including:
- applying machine learning to help limit spam and reduce the posts people see that link to low-quality web pages;
- adopting new ways to fight against disguising the true destination of an ad or post, or the real content of the destination page, in order to bypass Facebook’s review processes;
- reducing the influence of spammers by deprioritizing links that they share far more frequently than typical sharers;
- reducing stories from sources that consistently post clickbait headlines that withhold and exaggerate information;
- and blocking Pages from advertising if they repeatedly share stories marked as false.
We will continue to invest in our people and technology to help provide a safe place for civic discourse and meaningful connections on Facebook.
By Satwik Shukla, Product Manager & Tessa Lyons, Product Manager
Over the past year we have taken several steps to reduce false news and hoaxes on Facebook. Currently, we do not allow advertisers to run ads that link to stories that have been marked false by third-party fact-checking organizations. Now we are taking an additional step. If Pages repeatedly share stories marked as false, these repeat offenders will no longer be allowed to advertise on Facebook.
This update will help reduce the distribution of false news and keep Pages that spread false news from making money. We’ve found instances of Pages using Facebook ads to build their audiences in order to distribute false news more broadly. Now, if a Page repeatedly shares stories that have been marked as false by third-party fact-checkers, it will no longer be able to buy ads on Facebook. If Pages stop sharing false news, they may be eligible to start running ads again.
False news is harmful to our community. It makes the world less informed and erodes trust. At Facebook, we’re working to fight the spread of false news in three key areas:
- Disrupting the economic incentives to create false news;
- Building new products to curb the spread of false news; and
- Helping people make more informed decisions when they encounter false news.
Today’s update helps to disrupt the economic incentives and curb the spread of false news, which is another step towards building a more informed community on Facebook.
By Oren Hod, Product Manager
People come to Facebook to experience, share and talk about some of the most important moments happening in their lives, their communities and the world. Many of these moments involve reminiscing with friends about shared memories.
Since launching On This Day more than two years ago, we’ve learned that people enjoy revisiting and celebrating many different types of memories and moments. That’s why we’re excited to share two new ways for people to relive meaningful memories and celebrate special moments on Facebook.
Recapping Your Memories
We’ve launched a new experience that packages your recent memories in a delightful way for you to enjoy and share. For related recent memories, we will bundle them into a monthly or seasonal memory recap story. Like On This Day, these memory recap stories will show up in News Feed and are shareable.
Celebrate Your Friendships
We’re launching a new way to celebrate the actions that connect you and your community on Facebook. There are two types of moments where you may see these celebratory messages – when you make a notable number of friends on Facebook, and when your friends have liked your posts. We plan to launch more messages like this in the next few months. Additionally, these messages are currently only shown to you, but will become shareable in the near future.
Updates to On This Day and Memory Preferences
We’ve received input from people over the past two years and have worked to improve On This Day, such as making controls and preferences easier to access. On This Day is one of Facebook’s most popular experiences and we’re excited that this feature is now available to everyone on Facebook.
Finally, we know that occasionally some memories may spark negative feelings that you would rather avoid. We’ve invested a lot in developing ways to filter content so that we surface the photos we believe will be most relevant and enjoyable to you.
We know how much people cherish their friendships and memories, which is why we approach these experiences with sensitivity and care. Our goal is to create a supportive environment that allows you to express your feelings and connect with what matters to you and your community.
By Monika Bickert, Director of Global Policy Management
In the days after my husband died, I kept sending him text messages. His cell phone lay uncharged on my nightstand, just a few feet away from me, and I knew no one would ever read the words I wrote, but I kept writing anyway. I needed to feel like I was still connected to him. As I sat in bed texting, I knew that my phone also held recent photos of Phil smiling with our daughters and a video of him laughing with his brother just two days before I took him to the hospital, but I didn’t look at those. It would have hurt too much. Instead, I just kept writing to him, pretending he was on the other side of the messages I was sending and would soon write back.
When we lose someone we love, we often feel a desperate need to connect to them in whatever way we can. In moments like that, our phones, the internet and social media can sometimes be a refuge. We can talk to our loved ones, as I did, or when we’re ready to face the memories, we can lose ourselves in old emails, photos, videos and posts. With an ease that wasn’t possible 20 years ago, we can now hear and see our loved ones after they are gone, and we can share those memories with others who are grieving.
But other times, the online world can make loss even more painful. The reminders of our loved ones are everywhere, and with each reminder a renewed realization of their death. For months after Phil died, I’d cry when I’d receive an Amazon email prompting him to order his regular shipment of secondhand detective novels, or a message from his pharmacy cheerfully reminding him that his chemotherapy was ready for pickup. Even now, I pause whenever I log into Facebook and see a post of mine resurfaced from years ago. I worry it will be one of the many I shared with friends over the course of Phil’s battle with cancer, detailing his progress and hinting at our naïve faith that he would continue to beat the odds.
Depending on the circumstances of a person’s death, those online reminders can be overwhelming. A mother who loses her daughter to domestic violence may feel sick when she looks online and sees photos of her daughter’s wedding day. A university student who receives a birthday reminder for a roommate who died by suicide might feel grief more acutely thinking of all the expressions of love and support his roommate would be receiving if he were around.
Our Approach at Facebook
When people come to Facebook after suffering a loss, we want them to feel comfort, not pain, which is why we stop sending birthday reminders once we know someone has passed away, and why we try to make it easy for surviving family members to reach us.
All too often, however, it’s difficult for us to know what action to take with the account of someone who has died. What should we do with an account of a deceased young woman, for instance, when one of her parents wants to delete the account but the other wants to preserve it as a memorial for friends and family? How do we know what the daughter would have wanted? And what should we do if they want to see the private messages between the daughter and her friends – friends who are still alive and don’t want their messages to become public?
These questions — how to weigh survivors’ competing interests, determine the wishes of the deceased, and protect the privacy of third parties – have been some of the toughest we’ve confronted, and we still don’t have all the answers. Laws may provide clarity, but often they do not. In many countries, the legal framework for transferring assets to surviving family members does not account for digital assets like social media or email accounts. We are, however, doing our part to try and make these situations easier for everyone.
Respect the Wishes of the Deceased
Where the law permits, we try to respect the wishes of those who have passed away. Sometimes, however, we simply don’t know what the person would have wanted. If a bereaved spouse asks us to add her as a friend to her late husband’s profile so she can see his photos and posts, how do we know if that’s what her husband would have wanted? Is there a reason they were not previously Facebook friends? Does it mean something if she had sent him a friend request when he was alive and he had rejected it? What if the wife had simply never been on Facebook until after her husband’s death?
If we don’t know what the deceased person would have wanted, we try to leave the account exactly as that person left it. When we learn that someone has passed away, our standard process is to add “Remembering” above the name on the person’s profile, to make clear that the account is now a memorial site, and to stop any new attempts to log into the account. Once we’ve memorialized an account, anything on the profile remains on Facebook and is visible to the people who could already see it before the profile was memorialized. We don’t remove or change anything. This is our way of respecting the choices someone made while alive.
Memorialization is our default action, but we know that some people might not want their account preserved this way. They might prefer that we delete their profile. Recognizing this, we give people a way to let us know they want their account permanently deleted when they die. We may also delete profiles when the next of kin tells us that the deceased loved one would have preferred that we delete the account rather than memorialize it.
Other people might want a friend or family member to be able to manage their profile as a memorial site after their death. That’s why in 2015, we created the option for people to choose a legacy contact. A legacy contact is a family member or friend who can manage certain features on your account if you pass away, such as changing your profile picture, accepting friend requests or adding a pinned post to the top of your profile. They can also elect to delete your account. You can give your legacy contact permission to download an archive of the photos, posts and profile information you shared on Facebook, but they won’t be able to log in as you or see your private messages. Find out more about legacy contacts and how to add one to your account in our Help Center.
Protect the Privacy of Survivors
Even where the laws are clear and the intent of the deceased person is clear, we sometimes have other interests to consider. For instance, if a father loses a teenaged son to suicide, the father might want to read the private messages of his son to understand what was happening in his son’s life. Had he been struggling in his university classes? Was he having problems with his boyfriend? As natural as it might seem to provide those messages to the father, we also have to consider that the people who exchanged messages with the son likely expected those messages would remain private.
Although cases like this are heartbreaking, we generally can’t turn over private messages on Facebook without affecting other people’s privacy. In a private conversation between two people, we assume that both people intended the messages to remain private. And even where it feels right to turn over private messages to family members, laws may prevent us from doing so. The Electronic Communications Privacy Act and Stored Communications Act, for instance, prevent us from relying upon family consent to disclose the contents of a person’s communications.
We’re Still Learning
Despite our efforts to respect the wishes of those who pass away and those who survive them, we still encounter difficult situations where we end up disappointing people.
And even when we know perfectly and can act consistently with the wishes of the deceased and their loved ones, we know our actions will be of limited comfort. As I’m learning from my own experience, grief doesn’t recede quickly or quietly. Nearly a year after Phil died, I still catch my breath when I look through old photos on my phone. Some of those photos, like the ones I took of Phil in the hospital when I mistakenly thought we’d be going home the next day, move me to tears.
But others, like the one of him standing proudly in our backyard with our daughters on Father’s Day, are starting to make me smile again. Those flashes of happiness, however brief, prove to me that reminders of our loved ones don’t have to be reminders of loss. And that, in turn, gives me hope that social media and the rest of our online world, rather than provoking pain, can ultimately ease our grief.
By Baraa Hamodi, Engineer, Zahir Bokhari, Engineer, and Yun Zhang, Engineer
As part of our ongoing efforts to fight clickbait and improve the integrity of information on Facebook, today we are announcing two updates that will limit the spread of stories in News Feed that feature either fake video play buttons embedded in their imagery or videos consisting of only a static image.
People want to see accurate information on Facebook, and so do we. When people click on an image in their News Feed featuring a play button, they expect a video to start playing. Spammers often use fake play buttons to trick people into clicking links to low-quality websites.
Similarly, these deceptive spammers also use static images disguised as videos to trick people into clicking through to a low-quality experience. To limit this, during the coming weeks we will begin demoting stories in News Feed that feature fake video play buttons or static images disguised as videos.
Authentic communication is one of our core News Feed values, and we know our community values it.
How Will This Impact My Page?
Publishers that rely on these intentionally deceptive practices should expect the distribution of those clickbait stories to markedly decrease. Most Pages won’t see significant changes to their distribution in News Feed. But, as always, publishers should refer to our publishing best practices.
By Mike Nowak, Product Director
People come to Facebook to send well-wishes and celebrate birthdays with friends. In fact, every day more than 45 million people give birthday wishes on Facebook, which is why it’s important to us to ensure you can celebrate the way you want to.
We’re excited to announce two new birthday experiences that we hope will make birthdays even more meaningful while you’re celebrating on Facebook.
Giving Back On Your Birthday
People often dedicate their birthday to support a cause, and we’ve seen people using Facebook to raise money for causes they care about. For those in the US, we’re now making it easier to do this by giving you the opportunity to create a fundraiser for your birthday directly on Facebook.
Two weeks before your birthday, you’ll see a message from Facebook in your News Feed giving you the option to create a fundraiser for your birthday. You can create a fundraiser for any of the 750,000 US nonprofits available for fundraising on Facebook. Your friends will receive a notification inviting them to support your cause in honor of your special day.
Wish Your Best Friends Happy Birthday With a Video
We wanted to make birthdays even more special by giving people the opportunity to share a birthday wish with a close friend on their special day, which is why we’ve introduced shareable birthday videos made especially for you and your close friends.
These videos will be shown to you on the day of a close friend’s birthday. Like our other personalized videos, they are designed to make the birthday experience on Facebook even more fun for the special relationships in your life.
Birthdays have always been a part of Facebook, and we hope to continue providing you with a variety of experiences that make celebrating on the platform fun and meaningful for you and your friends.
by Shali Nguyen, Product Design Manager and Ryan Freitas, Design Director
Every person’s News Feed is different and populated with a unique set of stories — from photos and videos to GIFs and links. And with so many types of stories available, each feed is more complex than ever. In order to make News Feed more conversational and easier to read and navigate, we’ll be making a few updates to its design over the coming weeks.
We’re always working to help people have more lively and expressive conversations on Facebook. More and more, comments have become the way to have conversations about a post with other people. We’ve updated our comment style and made it easier to see which comments are direct replies to another person.
We’re making updates to refresh the look and feel of News Feed, including:
- Increased color contrast so that typography is more legible
- Larger link previews so everything is easier to read
- Updated icons and Like, Comment, and Share buttons that are larger and easier to tap
- Circular profile pictures to show who’s posting or commenting
We wanted to improve how people navigate News Feed to create a more consistent experience. We’re making it easier to:
- See where a link will take you before clicking on it
- See whose post you’re commenting on, reacting to, or reading while you’re in the post
- Return to News Feed once you’ve finished reading via a more prominent back button
Will This Impact My Page?
These design updates should not affect Pages’ reach or referral traffic.
By Deborah Liu, VP, Marketplace
Today, we’re starting to roll out Marketplace to 17 countries across Europe (Austria, Belgium, the Czech Republic, Denmark, Finland, France, Germany, Hungary, Ireland, Italy, Luxembourg, Netherlands, Norway, Portugal, Spain, Sweden and Switzerland), giving more people a single destination on Facebook to discover, buy and sell goods in their local communities.
Marketplace has already expanded to six countries (Australia, Canada, Chile, Mexico, New Zealand and the UK). Whether you’re a new parent looking for baby clothes or a collector looking for a rare find, you can feel good about buying and selling on Marketplace because it’s easy to view the public profiles of buyers and sellers, your mutual friends, and how long they’ve been on Facebook.
Throughout our initial rollout we have focused on making it easy for people to connect, browse and discover products. In May, more than 18 million new items were posted for sale in Marketplace in the US, and that number continues to grow.
Go to our Help Center for more tips on how to buy and sell in Marketplace.
By Rob Leathern, Product Management Director and Bobbie Chang, Software Engineer
We are always working to combat the spread of misinformation and the financially-motivated bad actors who create misleading experiences for people. Today we’re sharing additional steps we’ve taken to remove even more of them from Facebook, so that what people see after clicking an ad or post matches their expectations.
Some of the worst offenders use a technique known as “cloaking” to circumvent Facebook’s review processes and show content to people that violates Facebook’s Community Standards and Advertising Policies. Here, these bad actors disguise the true destination of an ad or post, or the real content of the destination page, in order to bypass Facebook’s review processes. For example, they will set up web pages so that when a Facebook reviewer clicks a link to check whether it’s consistent with our policies, they are taken to a different web page than when someone using the Facebook app clicks that same link. Cloaked destination pages, which frequently include diet pills, pornography and muscle building scams, create negative and disruptive experiences for people.
Since cloaking exists across many of today’s digital platforms, we will also be collaborating closely with other companies in the industry to find new ways to combat it and punish bad actors. Over the past few months we have been ramping up our enforcement across ads, posts and Pages, and have strengthened our policies to explicitly call out this practice. We will ban advertisers or Pages found to be cloaking from the platform.
How We Identify Cloaking
We are utilizing artificial intelligence and have expanded our human review processes to help us identify, capture, and verify cloaking. We can now better observe differences in the type of content served to people using our apps compared to our own internal systems.
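As a rough sketch of the idea, a cloaking check might compare what a reviewer’s fetch and an ordinary user’s fetch of the same link return. The function names and the hash-based comparison are illustrative assumptions; our actual systems rely on far richer signals than a simple content fingerprint:

```python
import hashlib

def content_fingerprint(html: str) -> str:
    """Reduce a page to a stable fingerprint (whitespace- and
    case-insensitive) so two fetches can be compared."""
    normalized = " ".join(html.split()).lower()
    return hashlib.sha256(normalized.encode()).hexdigest()

def looks_cloaked(page_seen_by_reviewer: str, page_seen_by_user: str) -> bool:
    """Flag a link when reviewers and ordinary users receive
    materially different pages from the same URL."""
    return content_fingerprint(page_seen_by_reviewer) != content_fingerprint(page_seen_by_user)

# A reviewer sees a harmless page; a regular user sees the real, violating content.
print(looks_cloaked("<p>Harmless recipe blog</p>", "<p>Miracle diet pills!</p>"))  # → True
```

In practice, trivial differences (timestamps, rotating banners) would make an exact-hash comparison too brittle, which is one reason human review and machine learning models are layered on top of automated signals like this.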
In the past few months these new steps have resulted in us taking down thousands of these offenders and disrupting their economic incentives for misleading people.
How Will This Impact My Page?
We see cloaking as deliberate and deceptive, and will not tolerate it on Facebook. We will remove Pages that engage in cloaking. Otherwise Pages should not see changes to their referral traffic.
By Daniel Danker, Director of Product
Watching video on Facebook has the incredible power to connect people, spark conversation, and foster community. On Facebook, videos are discovered through friends and bring communities together. As more and more people enjoy this experience, we’ve learned that people like the serendipity of discovering videos in News Feed, but they also want a dedicated place they can go to watch videos. That’s why last year we launched the Video tab in the U.S., which offered a predictable place to find videos on Facebook. Now we want to make it even easier to catch up with shows you love.
Introducing Watch
We’re introducing Watch, a new platform for shows on Facebook. Watch will be available on mobile, on desktop and laptop, and in our TV apps. Shows are made up of episodes — live or recorded — and follow a theme or storyline. To help you keep up with the shows you follow, Watch has a Watchlist so you never miss out on the latest episodes.
Watch is personalized to help you discover new shows, organized around what your friends and communities are watching. For example, you’ll find sections like “Most Talked About,” which highlights shows that spark conversation, “What’s Making People Laugh,” which includes shows where many people have used the “Haha” reaction, and “What Friends Are Watching,” which helps you connect with friends about shows they too are following.
We’ve learned from Facebook Live that people’s comments and reactions to a video are often as much a part of the experience as the video itself. So when you watch a show, you can see comments and connect with friends and other viewers while watching, or participate in a dedicated Facebook Group for the show.
A Platform for Shows
Watch is a platform for all creators and publishers to find an audience, build a community of passionate fans, and earn money for their work. We think a wide variety of Facebook shows can be successful, particularly:
- Shows that engage fans and community. Nas Daily publishes a daily show where he makes videos together with his fans from around the world. The Watchlist makes it easy for fans to catch every day’s new episode.
- Live shows that connect directly with fans. Gabby Bernstein, a New York Times bestselling author, motivational speaker, and life coach, uses a combination of recorded and live episodes to connect with her fans and answer questions in real time.
- Shows that follow a narrative arc or have a consistent theme. Tastemade’s Kitchen Little is a funny show about kids who watch a how-to video of a recipe, then instruct professional chefs on how to make it. Each episode features a new child, a new chef, and a new recipe. Unsurprisingly, the food doesn’t always turn out as expected.
- Live events that bring communities together. Major League Baseball is broadcasting a game a week on Facebook, enabling people to watch live baseball while connecting with friends and fellow fans on the platform.
We think Watch will be home to a wide range of shows, from reality to comedy to live sports. To help inspire creators and seed the ecosystem, we’ve also funded some shows that are examples of community-oriented and episodic video series. For example, Returning the Favor is a series hosted by Mike Rowe where he finds people doing something extraordinary for their community, tells the world about it, and in turn does something extraordinary for them. Candidates are nominated by Mike’s fans on Facebook.
We’re excited to see how creators and publishers use shows to connect with their fans and community. You can learn more about making shows on our Media blog.
We’ll be introducing Watch to a limited group of people in the U.S. and plan to bring the experience to more people soon. Similarly, we’ll be opening up Shows to a limited group of creators and plan to roll out to all soon.
We are continuing in our commitment to provide regular updates on changes to or additions of metrics across Facebook ads and Pages. Today, we are sharing two new metrics updates: the removal of unintentional clicks on ads in the Audience Network and new ad impression reporting.
Removing unintentional clicks from Audience Network
When browsing across the web or in an app, ads may pop up in places that cause people to accidentally click on them. These interactions are often quick, as the person immediately closes out of the landing page to return to their original destination.
These ad experiences can be profitable in the short term for publishers, but they fail to deliver good experiences for businesses or people. For advertisers, these kinds of unintentional clicks can dilute the value of their campaigns.
To understand whether a click is intentional, one of the metrics we look at in our delivery models and quality detection systems is the “drop-off rate,” the time a person spends on the landing page of an ad. We found that people who click on an Audience Network ad and spend less than 2 seconds on the destination page almost always clicked accidentally. Moving forward, we will no longer count clicks categorized as unintentional in advertisers’ campaigns.
We’re also clarifying our policies around ad placements to better serve people and businesses. You can learn more about those changes here.
We will also begin experimenting with more ways to reduce the number of unintentional clicks, by looking into additional bounce-rate metrics and by trying to prevent people from accidentally clicking in the first place.
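As a rough illustration of the dwell-time heuristic described above, a click could be classified and filtered like this. The two-second threshold comes from this post; the function and field names are assumptions, and the production systems weigh many signals beyond dwell time:

```python
# Threshold from the post: under 2 seconds on the landing page
# almost always indicates an accidental click.
UNINTENTIONAL_DWELL_SECONDS = 2.0

def is_unintentional_click(dwell_seconds: float) -> bool:
    """Classify a click as unintentional if the person left the
    landing page almost immediately (a quick "bounce")."""
    return dwell_seconds < UNINTENTIONAL_DWELL_SECONDS

def billable_clicks(click_dwell_times: list[float]) -> int:
    """Count only the clicks that would still be reported to advertisers."""
    return sum(1 for t in click_dwell_times if not is_unintentional_click(t))

# Example: three quick bounces and two genuine visits.
print(billable_clicks([0.5, 1.2, 1.9, 8.0, 45.0]))  # → 2
```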
Above is an example of an ad placement that may lead to an unintentional click.
Better view of total campaign impressions
Advertisers tell us they want simpler insights into how their ads are delivered. To give businesses a better view of the total impressions their ads receive, we are providing two new metrics to help offer more clarity on the number of ads shown to people.
Gross impressions capture all impressions, billable and non-billable. Impressions aren’t billed if they were delivered after an advertiser’s budget was spent, were served to the same person within a short time, or resulted from detectable fraud. Gross impressions give marketers the opportunity to quantify non-billable impressions.
Auto-refresh impressions provide a granular look at impressions generated from right-hand side placements. Ads placed on the right-hand side of the desktop News Feed are automatically refreshed with a new set of ads after a period of time. Auto-refresh impressions will show you how many impressions on your right-hand side ad are a result of a browser refresh.
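To make the relationship between these metrics concrete, here is a small illustrative calculation. The category names and figures are assumptions based on the descriptions above, not Facebook’s actual reporting schema:

```python
def gross_impressions(billable: int, after_budget: int,
                      deduplicated: int, fraud: int) -> int:
    """Gross impressions = billable impressions plus every non-billable
    category (post-budget delivery, short-window duplicates, detected fraud)."""
    return billable + after_budget + deduplicated + fraud

total = gross_impressions(billable=9_500, after_budget=300,
                          deduplicated=150, fraud=50)
print(total)          # → 10000 gross impressions
print(total - 9_500)  # → 500 non-billable impressions
```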
These updates offer more transparency into ad delivery and help ensure that you pay for valuable impressions. As always, keep checking back for further metrics updates in the coming months.
More About Our People, Programs and Progress in 2017
By Maxine Williams, Global Director of Diversity
With a global community of over 2 billion people on Facebook, the case for a more diverse and inclusive company is clear. Diversity helps us build better products, make better decisions and better serve our community.
We aren’t where we’d like to be, but we’re encouraged that over the past year, representation for people from underrepresented groups at Facebook has increased. This year, the number of women globally has risen from 33% to 35% and the number of women in tech has increased from 17% to 19%. Women now make up 27% of all new graduate hires in engineering and 21% of all new technical hires at Facebook. In the US, we have increased the representation of Hispanics from 4% to 5%, and Black people from 2% to 3%.
We are proud of the contributions of all of our people. Product Design Director Dantley Davis’ team is focused on building AR capabilities for the Facebook Camera. Delfina Eberly, VP of Infrastructure, Site Operations, runs our cutting-edge data center infrastructure. Community Operations Director James Mitchell’s team helps keep people safe on Instagram. We are already seeing a tangible impact from a more diverse Facebook – and we want to continue to find, grow, and keep the best talent.
We’re committed to building a more diverse, inclusive Facebook – and will remain committed. Much like our approach to launching new products on our platform, we are willing to experiment and listen to feedback. We want to highlight three programs in particular:
- Diverse Slate Approach: The more people you interview who don’t look or think like you, the more likely you are to hire someone from a diverse background. To hard wire this behavior at Facebook, we introduced our Diverse Slate Approach (DSA) in 2015 and have since rolled it out globally. DSA sets the expectation that hiring managers will consider candidates from underrepresented backgrounds when interviewing for an open position.
- Managing Unconscious Bias: Our publicly available Managing Unconscious Bias class encourages our people to challenge and correct bias as soon as they see it – in others, and in themselves. We’ve also doubled down by adding two new internal programs: Managing Inclusion, which trains managers to understand the issues that affect marginalized communities, and Be The Ally, which gives everyone the common language, tools and space to practice supporting others.
- Facebook University: We want to increase access and opportunity for students with an interest in software engineering, business and analytics. Facebook University gives underrepresented students extra training and mentorship earlier in their college education. We started FBU in 2013 with 30 students, and over 500 students have since graduated from the program, with many returning to Facebook for internships and full-time jobs.
You can see our latest employment data, read more about the impact of our people, and review some of our short, medium and long-term efforts in detail here.
By Jiayi Wen, Engineer, and Shengbo Guo, Engineer
We’re always listening to our community to understand how we can improve their experience of News Feed. We’ve heard from people that it’s frustrating to click on a link that leads to a slow-loading webpage. In fact, even more broadly on the internet, we’ve found that when people have to wait too long for a site to load, they abandon what they were clicking on altogether. As many as 40 percent of website visitors abandon a site after three seconds of delay.
During the coming months we’re making an update to News Feed to show people more stories that will load quickly on mobile and fewer stories that might take longer to load, so they can spend more time reading the stories they find relevant.
Taking loading time into account
With this update, we’ll soon take into account the estimated load time of a webpage that someone clicks to from any link in News Feed on the mobile app. Factors such as the person’s current network connection and the general speed of the corresponding webpage will be considered. If signals indicate the webpage will load quickly, the link to that webpage might appear higher in your feed.
For years, we have taken many factors into account to make sure people quickly see the stories most relevant to them — including the type of device someone is on and the speed of their mobile network or Wi-Fi connection. For example, if someone is on a slower internet connection that won’t load videos, News Feed will show them fewer videos and more status updates and links. And to help load stories faster for people on slow or poor network connections, we prefetch stories by downloading mobile content before someone clicks a link, which we’ve seen can shorten load times for webpages by more than 25%.
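A minimal sketch of how load-time-aware ranking might work, assuming a simple linear penalty model. The function name, penalty weight, and story data are illustrative assumptions; the real News Feed ranking weighs many more signals than estimated load time:

```python
def adjusted_score(relevance: float, est_load_seconds: float,
                   penalty_per_second: float = 0.1) -> float:
    """Demote a story's ranking score in proportion to how long its
    linked page is expected to take to load on the current connection."""
    return relevance - penalty_per_second * est_load_seconds

# (story name, relevance score, estimated page load time in seconds)
stories = [("fast-site article", 0.80, 1.0),
           ("slow-site article", 0.85, 8.0)]

ranked = sorted(stories, key=lambda s: adjusted_score(s[1], s[2]), reverse=True)
print([name for name, _, _ in ranked])  # the fast-loading article now ranks first
```

Under this toy model, a slightly more relevant story linked to a very slow page can fall below a slightly less relevant story linked to a fast one, which matches the behavior the update describes.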
Will This Impact my Page?
This update will roll out gradually over the coming months. We anticipate that most Pages won’t see any significant changes to their distribution in News Feed. Webpages that are particularly slow could see decreases in referral traffic. To help webpages avoid experiencing potential decreases, we’re sharing tips to help site owners make their site faster and more mobile-friendly. See here for publisher best practices for improving mobile site load time.
As always, publishers should keep in mind these basic guideposts to reach their audience on Facebook and continue to post stories that are relevant to their audiences and that their readers find informative.