
Our First Communities Summit and New Tools For Group Admins

Facebook - Thu, 06/22/2017 - 18:12

By Kang-Xing Jin, VP, Engineering

Today we hosted our first-ever Facebook Communities Summit in Chicago with hundreds of group admins where we announced new features to support their communities on Facebook.

Mark Zuckerberg kicked off by celebrating the role Groups play in the Facebook community and thanking the group admins who lead them. He also announced a new mission for Facebook that will guide our work over the next decade: Give people the power to build community and bring the world closer together.

An important part of delivering on our new mission is supporting group admins, who are real community leaders on Facebook. We’re adding several new features to help them grow and manage their groups:

  • Group Insights: group admins have told us consistently that having a better understanding of what’s going on in their groups would help them make decisions on how to best support their members. Now, with Group Insights, they’ll be able to see real-time metrics around growth, engagement and membership — such as the number of posts and the times when members are most engaged.
  • Membership request filtering: we also hear from admins that admitting new members is one of the most time-consuming things they do. So, we added a way for them to sort and filter membership requests on common categories like gender and location, and then accept or decline all at once.
  • Removed member clean-up: to help keep their communities safe from bad actors, group admins can now remove a person — along with the posts and comments they’ve created within the group and the other people they added — in one step.
  • Scheduled posts: group admins and moderators can create and conveniently schedule posts on a specific day and time.
  • Group to group linking: we’re beginning to test group-to-group linking, which allows group admins to recommend similar or related groups to their members. This is just the beginning of ways that we’re helping bring communities and sub-communities closer together.

More than 1 billion people around the world use Groups, and more than 100 million people are members of “meaningful groups.” These are groups that quickly become the most important part of someone’s experience on Facebook. Today we’re setting a goal to help 1 billion people join meaningful communities like these.

In Chicago, we celebrated some of these groups built around local neighborhoods, shared passions and life experiences. For example, some of the groups and admins that attended include:

  • Terri Hendricks, who started Lady Bikers of California so that women who ride motorcycles could connect with each other, meet in real life through group rides, and offer each other both motorcycle-related and personal support. Terri says that when she started riding motorcycles it was rare to see other women who rode and that across the group, there is “nothing that these ladies wouldn’t do for each other.”
  • Matthew Mendoza, who started Affected by Addiction Support Group. The group is a safe space for people who are experiencing or recovering from drug and alcohol addiction, as well as their friends and family, to offer support and share stories.
  • Kenneth Goodwin, minister of Bethel Church in Decatur, Georgia, who uses the Bethel Original Free Will Baptist Church group to post announcements to the local community about everything happening at Bethel. He and the other admins will often share information about events, meeting times for their small group ministries, and live videos of sermons so people who cannot attend can watch from their homes.

We’re inspired by these stories and the hundreds of others we’ve heard from people attending today’s event. We’re planning more events to bring together group admins outside the US and look forward to sharing more details soon.


Giving People More Control Over Their Facebook Profile Picture

Facebook - Thu, 06/22/2017 - 04:00

By Aarati Soman, Product Manager

Part of our goal in building global community is understanding the needs of people who use Facebook in specific countries and how we can better serve them. In India, we’ve heard that people want more control over their profile pictures, and we’ve been working over the past year to understand how we can help.

Today, we are piloting new tools that give people in India more control over who can download and share their profile pictures. In addition, we’re exploring ways people can more easily add designs to profile pictures, which our research has shown to be helpful in deterring misuse. Based on what we learn from our experience in India, we hope to expand to other countries soon.

Profile pictures are an important part of building community on Facebook because they help people find friends and create meaningful connections. But not everyone feels safe adding a profile picture. In our research with people and safety organizations in India, we’ve heard that some women choose not to share profile pictures that include their faces anywhere on the internet because they’re concerned about what may happen to their photos.

These tools, developed in partnership with Indian safety organizations like Centre for Social Research, Learning Links Foundation, Breakthrough and Youth Ki Awaaz, are designed to give people more control over their experience and help keep them safe online.

New Controls

People in India will start seeing a step-by-step guide to add an optional profile picture guard. When you add this guard:

  • Other people will no longer be able to download, share or send your profile picture in a message on Facebook
  • People you’re not friends with on Facebook won’t be able to tag anyone, including themselves, in your profile picture
  • Where possible, we’ll prevent others from taking a screenshot of your profile picture on Facebook; this protection is currently available only on Android devices
  • We’ll display a blue border and shield around your profile picture as a visual cue of protection

Deterring Misuse

Based on preliminary tests, we’ve learned that when someone adds an extra design layer to their profile picture, other people are at least 75% less likely to copy that picture.

We partnered with Jessica Singh, an illustrator who took inspiration from traditional Indian textile designs such as bandhani and kantha, to create designs for people to add to their profile picture.

If someone suspects that a picture marked with one of these designs is being misused, they can report it to Facebook and we will use the design to help determine whether it should be removed from our community.


Hard Questions: How We Counter Terrorism

Facebook - Thu, 06/15/2017 - 19:00

By Monika Bickert, Director of Global Policy Management, and Brian Fishman, Counterterrorism Policy Manager

In the wake of recent terror attacks, people have questioned the role of tech companies in fighting terrorism online. We want to answer those questions head on. We agree with those who say that social media should not be a place where terrorists have a voice. We want to be very clear how seriously we take this — keeping our community safe on Facebook is critical to our mission.

In this post, we’ll walk through some of our behind-the-scenes work, including how we use artificial intelligence to keep terrorist content off Facebook, something we have not talked about publicly before. We will also discuss the people who work on counterterrorism, some of whom have spent their entire careers combating terrorism, and the ways we collaborate with partners outside our company.

Our stance is simple: There’s no place on Facebook for terrorism. We remove terrorists and posts that support terrorism whenever we become aware of them. When we receive reports of potential terrorism posts, we review those reports urgently and with scrutiny. And in the rare cases when we uncover evidence of imminent harm, we promptly inform authorities. Although academic research finds that the radicalization of members of groups like ISIS and Al Qaeda primarily occurs offline, we know that the internet does play a role — and we don’t want Facebook to be used for any terrorist activity whatsoever.

We believe technology, and Facebook, can be part of the solution.

We’ve been cautious, in part because we don’t want to suggest there is any easy technical fix. It is an enormous challenge to keep people safe on a platform used by nearly 2 billion people every month, posting and commenting in more than 80 languages in every corner of the globe. And there is much more for us to do. But we do want to share what we are working on and hear your feedback so we can do better.

Artificial Intelligence

We want to find terrorist content immediately, before people in our community have seen it. Already, we find the majority of the accounts we remove for terrorism ourselves. But we know we can do better at using technology — and specifically artificial intelligence — to stop the spread of terrorist content on Facebook. Although our use of AI against terrorism is fairly recent, it’s already changing the ways we keep potential terrorist propaganda and accounts off Facebook. We are currently focusing our most cutting-edge techniques on combating terrorist content related to ISIS, Al Qaeda and their affiliates, and we expect to expand to other terrorist organizations in due course. We are constantly updating our technical solutions, but here are some of our current efforts.

  • Image matching: When someone tries to upload a terrorist photo or video, our systems look for whether the image matches a known terrorism photo or video. This means that if we previously removed a propaganda video from ISIS, we can work to prevent other accounts from uploading the same video to our site. In many cases, this means that terrorist content intended for upload to Facebook simply never reaches the platform (see the simplified sketch after this list).
  • Language understanding: We have also recently started to experiment with using AI to understand text that might be advocating for terrorism. We’re currently experimenting with analyzing text that we’ve already removed for praising or supporting terrorist organizations such as ISIS and Al Qaeda so we can develop text-based signals that such content may be terrorist propaganda. That analysis goes into an algorithm that is in the early stages of learning how to detect similar posts. The machine learning algorithms work on a feedback loop and get better over time.
  • Removing terrorist clusters: We know from studies of terrorists that they tend to radicalize and operate in clusters. This offline trend is reflected online as well. So when we identify Pages, groups, posts or profiles as supporting terrorism, we also use algorithms to “fan out” to try to identify related material that may also support terrorism. We use signals like whether an account is friends with a high number of accounts that have been disabled for terrorism, or whether an account shares the same attributes as a disabled account.
  • Recidivism: We’ve also gotten much faster at detecting new fake accounts created by repeat offenders. Through this work, we’ve been able to dramatically reduce the time period that terrorist recidivist accounts are on Facebook. This work is never finished because it is adversarial, and the terrorists are continuously evolving their methods too. We’re constantly identifying new ways that terrorist actors try to circumvent our systems — and we update our tactics accordingly.
  • Cross-platform collaboration: Because we don’t want terrorists to have a place anywhere in the family of Facebook apps, we have begun work on systems to enable us to take action against terrorist accounts across all our platforms, including WhatsApp and Instagram. Given the limited data some of our apps collect as part of their service, the ability to share data across the whole family is indispensable to our efforts to keep all our platforms safe.
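
To make the image-matching step above more concrete, here is a minimal sketch of checking an upload against a set of fingerprints of previously removed content. It is illustrative only: the hash set and function names are hypothetical, and a production system would use perceptual hashing that survives re-encoding and cropping rather than the exact cryptographic hash used here for simplicity.

```python
import hashlib

# Illustrative only: fingerprints ("hashes") of content that reviewers have
# already confirmed violates the terrorism policy.
known_violating_hashes = {
    "a3f1c9...",  # placeholder fingerprint of a previously removed video
}

def fingerprint(file_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded photo or video.

    A real system would use a perceptual hash that tolerates re-encoding;
    an exact SHA-256 is used here only to keep the sketch simple.
    """
    return hashlib.sha256(file_bytes).hexdigest()

def should_block_upload(file_bytes: bytes) -> bool:
    """Return True if the upload matches known violating content."""
    return fingerprint(file_bytes) in known_violating_hashes

# An upload whose bytes match previously removed content never reaches the
# platform; anything else proceeds to normal processing.
if should_block_upload(b"...uploaded file bytes..."):
    print("Upload blocked: matches previously removed content")
else:
    print("Upload accepted for normal processing")
```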

Human Expertise

AI can’t catch everything. Figuring out what supports terrorism and what does not isn’t always straightforward, and algorithms are not yet as good as people when it comes to understanding this kind of context. A photo of an armed man waving an ISIS flag might be propaganda or recruiting material, but could be an image in a news story. Some of the most effective criticisms of brutal groups like ISIS utilize the group’s own propaganda against it. To understand more nuanced cases, we need human expertise.

  • Reports and reviews: Our community — that’s the people on Facebook — helps us by reporting accounts or content that may violate our policies — including the small fraction that may be related to terrorism. Our Community Operations teams around the world — which we are growing by 3,000 people over the next year — work 24 hours a day and in dozens of languages to review these reports and determine the context. This can be incredibly difficult work, and we support these reviewers with onsite counseling and resiliency training.
  • Terrorism and safety specialists: In the past year we’ve also significantly grown our team of counterterrorism specialists. At Facebook, more than 150 people are exclusively or primarily focused on countering terrorism as their core responsibility. This includes academic experts on counterterrorism, former prosecutors, former law enforcement agents and analysts, and engineers. Within this specialist team alone, we speak nearly 30 languages.
  • Real-world threats: We increasingly use AI to identify and remove terrorist content, but computers are not very good at identifying what constitutes a credible threat that merits escalation to law enforcement. We also have a global team that responds within minutes to emergency requests from law enforcement.

Partnering with Others

Working to keep terrorism off Facebook isn’t enough because terrorists can jump from platform to platform. That’s why partnerships with others — including other companies, civil society, researchers and governments — are so crucial.

  • Industry cooperation: In order to more quickly identify and slow the spread of terrorist content online, we joined with Microsoft, Twitter and YouTube six months ago to announce a shared industry database of “hashes” — unique digital fingerprints for photos and videos — for content produced by or in support of terrorist organizations. This collaboration has already proved fruitful, and we hope to add more partners in the future. We are grateful to our partner companies for helping keep Facebook a safe place.
  • Governments: Governments and inter-governmental agencies also have a key role to play in convening and providing expertise that is impossible for companies to develop independently. We have learned much through briefings from agencies in different countries about ISIS and Al Qaeda propaganda mechanisms. We have also participated in and benefited from efforts to support industry collaboration by organizations such as the EU Internet Forum, the Global Coalition Against Daesh, and the UK Home Office.
  • Encryption: We know that terrorists sometimes use encrypted messaging to communicate. Encryption technology has many legitimate uses – from protecting our online banking to keeping our photos safe. It’s also essential for journalists, NGO workers, human rights campaigners and others who need to know their messages will remain secure. Because of the way end-to-end encryption works, we can’t read the contents of individual encrypted messages — but we do provide the information we can in response to valid law enforcement requests, consistent with applicable law and our policies.
  • Counterspeech training: We also believe challenging extremist narratives online is a valuable part of the response to real world extremism. Counterspeech comes in many forms, but at its core these are efforts to prevent people from pursuing a hate-filled, violent life or convincing them to abandon such a life. But counterspeech is only effective if it comes from credible speakers. So we’ve partnered with NGOs and community groups to empower the voices that matter most.
  • Partner programs: We support several major counterspeech programs. For example, last year we worked with the Institute for Strategic Dialogue to launch the Online Civil Courage Initiative, a project that has engaged with more than 100 anti-hate and anti-extremism organizations across Europe. We’ve also worked with Affinis Labs to host hackathons in places like Manila, Dhaka and Jakarta, where community leaders joined forces with tech entrepreneurs to develop innovative solutions to push back against extremism and hate online. And finally, the program we’ve supported with the widest global reach is a student competition organized through the P2P: Facebook Global Digital Challenge. In less than two years, P2P has reached more than 56 million people worldwide through more than 500 anti-hate and extremism campaigns created by more than 5,500 university students in 68 countries.

Our Commitment

We want Facebook to be a hostile place for terrorists. The challenge for online communities is the same as it is for real world communities – to get better at spotting the early signals before it’s too late. We are absolutely committed to keeping terrorism off our platform, and we’ll continue to share more about this work as it develops in the future.

Read more about our new blog series Hard Questions. We want your input on what other topics we should address — and what we could be doing better. Please send suggestions to hardquestions@fb.com.


Hard Questions

Facebook - Thu, 06/15/2017 - 14:00

By Elliot Schrage, Vice President for Public Policy and Communications

Today we’re starting something new.

Facebook is where people post pictures with their friends, get their news, form support groups and hold politicians to account. What started out as a way for college students in the United States to stay in touch is now used by nearly 2 billion people around the world. The decisions we make at Facebook affect the way people find out about the world and communicate with their loved ones.

It goes far beyond us. As more and more of our lives extend online, and digital technologies transform how we live, we all face challenging new questions — everything from how best to safeguard personal privacy online to the meaning of free expression to the future of journalism worldwide.

We debate these questions fiercely and freely inside Facebook every day — and with experts from around the world whom we consult for guidance. We take seriously our responsibility — and accountability — for our impact and influence.

We want to broaden that conversation. So today, we’re starting a new effort to talk more openly about some complex subjects. We hope this will be a place not only to explain some of our choices but also explore hard questions, such as:

  • How should platforms approach keeping terrorists from spreading propaganda online?
  • After a person dies, what should happen to their online identity?
  • How aggressively should social media companies monitor and remove controversial posts and images from their platforms? Who gets to decide what’s controversial, especially in a global community with a multitude of cultural norms?
  • Who gets to define what’s false news — and what’s simply controversial political speech?
  • Is social media good for democracy?
  • How can we use data for everyone’s benefit, without undermining people’s trust?
  • How should young internet users be introduced to new ways to express themselves in a safe environment?

As we proceed, we certainly don’t expect everyone to agree with all the choices we make. We don’t always agree internally. We’re also learning over time, and sometimes we get it wrong. But even when you’re skeptical of our choices, we hope these posts give a better sense of how we approach them — and how seriously we take them. And we believe that by becoming more open and accountable, we should be able to make fewer mistakes, and correct them faster.

Our first substantive post, later today, will be about responding to the spread of terrorism online — including the ways we’re working with others and using new technology.

We want your input on what other topics we should address — and what we could be doing better. Please send suggestions to hardquestions@fb.com.


Celebrating 30 Years of the GIF

Facebook - Thu, 06/15/2017 - 05:59

On June 15, we’re celebrating the 30th anniversary of the GIF, which has made communicating on the internet more joyful, more visual and let’s face it, a whole lot funnier! To mark the big 3-0, we’re:

  • Taking an inside look at GIF popularity on Messenger
  • Announcing that GIFs in comments are now available to everyone on Facebook (yay!)
  • Introducing some new and exclusive GIFs we’ve created featuring some of the internet’s biggest stars
  • Asking you to help us answer the age-old debate of how to pronounce the word “GIF”

An Inside Look at GIFs in Messenger

With this milestone approaching, we took a look at how GIFs have transformed the way people communicate with each other since we introduced GIFs in Messenger in 2015:

  • People on Messenger sent nearly 13 billion GIFs in the last year, or nearly 25,000 GIFs every minute
  • GIF sends on Messenger have tripled in the past year
  • New Year’s Day 2017 was the most popular day ever for GIF sends on Messenger, with more than 400 million GIF sends

GIFs in Facebook Comments are Finally Here!

We know people love communicating with GIFs on Messenger, and we’re also making it easier to use GIFs on Facebook. Today we’re introducing the ability to add GIFs in comments for all people on Facebook globally.

Just tap the GIF button when you go to make a comment, type in what you’re looking to say, and add the GIF that really nails it!

The GIF Party

We’re also celebrating the 30th anniversary the best way we know how — a GIF party with some of your favorite stars.

GIPHY Studios created 20 GIFs featuring some of the internet’s most recognizable faces: DNCE, Logan Paul, Amanda Cerny, DREEZY, Patrick Starr, Violet Benson, Wuz Good, Brandi Marie, and Landon Moss.

Each GIF is a unique and shareable morsel of human expression. They will be available to use by searching #GIFparty when sharing a GIF on Facebook or Messenger or by visiting GIPHY.com/Facebook.

(Example GIFs shown: Logan Paul, Violet Benson, Amanda Cerny and Landon Moss)

Ending an Age-old Debate: How Do You Pronounce GIF?

Finally, we’re looking to solve the debate over how the word GIF is pronounced once and for all. Over the next few days, if you live in the US you might see a poll on Facebook asking you to cast your vote. You can also vote by visiting Facebook’s official Page on your mobile phone. To find the Page, search for “Facebook” in the main Facebook app.

We’ll report back here on whether the “hard g” or “soft g” pronunciation reigns supreme.


Announcing Updates to Safety Check

Facebook - Wed, 06/14/2017 - 15:00

By Naomi Gleit, VP Social Good

As part of our ongoing commitment to build a safe community, today we’re announcing several updates to Safety Check:

  • Introducing Fundraisers in Safety Check: people in the US will have the option to start a fundraiser from within Safety Check
  • Expanding Community Help: Community Help will be available on desktop and for all crisis types where Safety Check is activated
  • Adding more context with a personal note: now people can share a personal note in their Safety Check News Feed story with friends and loved ones
  • Introducing crisis descriptions: get more information about a crisis from NC4, our trusted third party global crisis reporting agency, within the Safety Check tool

Introducing Fundraisers in Safety Check
Following a crisis, one way people give and request help is through fundraising. To make this easier, we are introducing Fundraisers in Safety Check. Within Safety Check, people will be able to create or donate to a fundraiser for charitable and personal causes to help those in need. Fundraising gives people outside the crisis area a way to offer help. Fundraisers in Safety Check will start to roll out in the coming weeks in the US.

Expanding Community Help
Since we launched Community Help earlier this year on iOS and Android, we have been inspired by the offers and requests for help generated by the community and want to make sure that those in need are able to access Community Help through any platform. Community Help will be available in the upcoming weeks on desktop, giving people another way to access the tool. Additionally, Community Help is now available for all crises where Safety Check is activated.

Adding more context with a personal note
After marking themselves safe, people share additional information to help reassure friends they are safe and to provide more context about the crisis. To make this easier, people can now add a personal note to tell their friends more about what’s happening from within the Safety Check tool. This note will appear in the News Feed story that is automatically generated when people mark themselves safe.

Introducing crisis descriptions
When people receive Safety Check notifications, they may have limited information about the crisis. To help provide additional context on crises and make sure people have the information that they need, we have started adding descriptions about the crisis from NC4, our trusted third party global crisis reporting agency.

Safety Check has been activated more than 600 times in two years and has notified people that their families and friends are safe more than a billion times. Keeping the community safe means everything to us at Facebook, and we hope these updates help Safety Check continue to do just that.


Using Data to Help Communities Recover and Rebuild

Facebook - Wed, 06/07/2017 - 18:00

By Molly Jackman, Public Policy Research Manager

After a flood, fire, earthquake or other natural disaster, response organizations need accurate information, and every minute counts in saving lives. Traditional communication channels are often offline and it can take significant time and resources to understand where help is desperately needed.

Facebook can help response organizations paint a more complete picture of where affected people are located so they can determine where resources — like food, water and medical supplies — are needed and where people are out of harm’s way.

Today, we are introducing disaster maps that use aggregated, de-identified Facebook data to help organizations address the critical gap in information they often face when responding to natural disasters. Many of these organizations worked with us to identify what data would be most helpful and how it could be put to action in the moments following a disaster.

This initiative is the product of close work with UNICEF, the International Federation of the Red Cross and Red Crescent Societies, the World Food Programme, and other organizations. It is an example of how technology can help keep people safe, one of our five areas of focus as we help build a global community.

Based on these organizations’ feedback we are providing multiple types of maps during disaster response efforts, which will include aggregated location information people have chosen to share with Facebook.

Location density maps show where people are located before, during and after a disaster. We can compare this information to historical records, like population estimates based on satellite images. Comparing these data sets can help response organizations understand areas impacted by a natural disaster.

Movement maps illustrate patterns of movement between different neighborhoods or cities over a period of several hours. By understanding these patterns, response organizations can better predict where resources will be needed, gain insight into patterns of evacuation, or predict where traffic will be most congested.

Safety Check maps are based on where our community uses Safety Check to notify their friends and family that they are safe during a disaster. We are using this de-identified data in aggregate to show where more or fewer people check in safe, which may help organizations understand where people are most vulnerable and where help is needed.
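
To make the idea of aggregated, de-identified data more concrete, here is a minimal sketch of binning shared locations into coarse map cells and publishing only cells above a minimum count. The grid resolution, threshold and sample data are illustrative assumptions, not Facebook's actual pipeline.

```python
from collections import Counter

# Illustrative only: coarse (lat, lon) points people chose to share;
# no names or identifiers are carried into the aggregation.
shared_locations = [
    (37.77, -122.42), (37.77, -122.42), (37.77, -122.42),
    (37.78, -122.41), (34.05, -118.24),
]

MIN_COUNT = 3   # assumed de-identification floor: cells with fewer people are dropped

def to_cell(lat: float, lon: float) -> tuple:
    """Snap a point to a roughly 1 km grid cell by rounding to 2 decimal places."""
    return (round(lat, 2), round(lon, 2))

# Count people per cell, then keep only cells above the floor.
counts = Counter(to_cell(lat, lon) for lat, lon in shared_locations)
density_map = {cell: n for cell, n in counts.items() if n >= MIN_COUNT}

print(density_map)   # only cells meeting the minimum count are published
```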

This type of information can help response organizations understand which neighborhoods suffered the most damage following an earthquake and where people might be in need of help as they evacuate their homes and eventually return.

We are sharing this information with trusted organizations that have capacity to act on the data and respect our privacy standards, starting with UNICEF, the International Federation of the Red Cross and Red Crescent Societies, and the World Food Programme. We are working with these organizations to establish formal processes for responsibly sharing the datasets with others.

Over time, we intend to make it possible for additional organizations and governments to participate in this program. All applications will be reviewed carefully by people at Facebook, including those with local expertise.

We believe that our platform is a valuable source of information that can help response organizations serve people more efficiently and effectively. Ultimately, we hope this data helps communities have the information they need to recover and rebuild if disaster strikes.


Making Facebook Live More Accessible With Closed Captions

Facebook - Tue, 06/06/2017 - 18:45

By Supratik Lahiri, Product Manager, and Jeffrey Wieland, Director of Accessibility

Making Facebook accessible to everyone is a key part of building global community. Today we’re allowing publishers to include closed captions in Facebook Live, helping people who are deaf or hard of hearing to experience live videos. Now, if your captioning settings are turned on, you’ll automatically see closed captions on Live broadcasts when they’re available.

Over the past year, daily watch time for Facebook Live broadcasts has grown by more than 4x, and 1 in 5 Facebook videos is a Live broadcast. By enabling publishers to include closed captions with their Live broadcasts, we hope more people can now participate in the exciting moments that unfold on Live.

Today’s milestone represents the next step in our efforts to make content on Facebook accessible to more people. It’s already possible to add captions to non-live videos when uploading them to Facebook Pages, and publishers can use our speech recognition service to automatically generate captions for videos on their Pages.

For more information on adding closed captions to Facebook Live broadcasts, click here. For more information on Facebook’s accessibility features and settings, click here, and follow news and updates from the Facebook Accessibility team here.


Facebook Celebrates Pride Month

Facebook - Mon, 06/05/2017 - 15:00

By Alex Schultz, VP & Executive Sponsor of pride@facebook

As Pride celebrations begin around the world, Facebook is proud to support our diverse community, including those who have identified themselves on Facebook as gay, lesbian, bisexual, transgender or gender non-conforming. In fact, this year, over 12 million people across the globe are part of one of the 76,000 Facebook Groups in support of the LGBTQ community, and more than 1.5 million people plan to participate in one of the more than 7,500 Pride events on Facebook.

This year, we’re excited to unveil more ways than ever before for people to show their pride and support for the LGBTQ community on Facebook:

Update Your Profile Pic with a Rainbow Frame
Throughout the month of June, you might see a message from Facebook in your News Feed wishing you a Happy Pride and inviting you to add a colorful, Pride-themed profile frame. You might also see a special animation on top of your News Feed if you react to our message.

React with Pride
You may see a colorful, limited-edition Pride Reaction during Pride Month. When you choose this temporary rainbow reaction, you’ll be expressing your “Pride” to the post.


Brighten Up Your Photos
In Facebook Camera, you can find some new colorful, Pride-themed masks and frames. If you swipe to the left of News Feed, click on the magic wand to bring up camera effects and you’ll be able to find the effects in the mask and frame category.


Support an LGBTQ Cause
In the US, start a Facebook Fundraiser or donate to your favorite LGBTQ cause. On Facebook, you can raise money for a nonprofit or people — for yourself, a friend or someone or something not on Facebook.

Facebook isn’t the only place to celebrate the cause. Across our family of apps, you will have the opportunity to show your support:

Join the #KindComments Movement on Instagram
The photo sharing app is committed to fostering a safer and kinder community, and this June will be turning walls in major US cities into colorful beacons of LGBTQ support where you can leave supportive comments on your posts. You can also celebrate Pride and be creative with stickers and a rainbow brush.


Frame Up with Pride on Messenger
During Pride month, you can add some love to your conversations with friends and family with Pride-themed stickers, frames, and effects in the Messenger Camera.

Our Commitment and Participation
Facebook has long been a supporter of LGBTQ rights, through our products, policies and benefits to our employees. Not only will we be a part of Pride activities in more than 20 cities around the world, including in San Francisco, where we first marched in 2011, but we will also celebrate with our employees by hosting events and discussions, as well as by draping the Facebook monument outside the Menlo Park headquarters in the rainbow flag, as the company has done each year since 2012.

Our commitment and support of the LGBTQ community has been unwavering. From our support of marriage equality and bullying prevention, to the many product experiences that we’ve brought to life, we are proud of our attention to the LGBTQ experience on Facebook, often thanks to the many LGBTQ people and allies who work here.

Last year, for the first time ever, we began publicly sharing self-reported data around our LGBTQ community at Facebook. In a recent, voluntary survey of our employees in the US about sexual orientation and gender identity, to which 67% responded, 7% self-identified as being lesbian, gay, bisexual, queer, transgender or asexual. We are proud to support the LGBTQ community, and while more work still remains, we are eager to be active partners going forward.

Happy Pride!


Update on Trending

Facebook - Wed, 05/24/2017 - 19:00

By Ali Ahmadi, Product Manager, and John Angelo, Product Designer

Redesigned Trending Results Page

Starting today, we’re introducing a redesigned Trending results page, which is the page you see when you click on a Trending topic to learn more about it.

You’ve always been able to click on a topic to see related posts and stories, but we’ve redesigned the page to make it easier to discover other publications that are covering the story, as well as what your friends and public figures are saying about it.

You’ll be able to see the new results page on iPhone in the US, and we plan to make it available on Android and desktop soon.

Now, when you click on a Trending topic, you’ll see a carousel with stories from other publications about a given topic that you can swipe through. By making it easier to see what other news outlets are saying about each topic, we hope that people will feel more informed about the news in their region.

The stories that appear in this section are some of the most popular stories about that topic on Facebook. These stories are determined the same way as the featured headline — using a combination of factors including the engagement around the article on Facebook, the engagement around the publisher overall, and whether other articles are linking to it.
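
The post names the ranking signals but not how they are combined. Purely as an illustration, the sketch below shows one way such signals could be weighted into a single score; the Story fields, weights and numbers are assumptions, not Facebook's actual ranking.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    article_engagement: float    # engagement around the article on Facebook
    publisher_engagement: float  # engagement around the publisher overall
    inbound_links: int           # other articles linking to this one

# Assumed weights for illustration only; the post does not disclose how
# the signals are actually combined.
WEIGHTS = (0.6, 0.3, 0.1)

def score(story: Story) -> float:
    w_article, w_publisher, w_links = WEIGHTS
    return (w_article * story.article_engagement
            + w_publisher * story.publisher_engagement
            + w_links * story.inbound_links)

stories = [
    Story("Local election results", 0.8, 0.5, 12),
    Story("Stadium reopens downtown", 0.6, 0.9, 3),
]

# Highest-scoring stories would appear first in the carousel.
for s in sorted(stories, key=score, reverse=True):
    print(f"{score(s):.2f}  {s.title}")
```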

There is no predetermined list of publications that are eligible to appear in Trending and this update does not affect how Trending topics are identified, which we announced earlier this year.

Making Trending Easier to Discover On Mobile

One of the things we regularly hear from people who use Trending is that it can be difficult to find in the Facebook mobile app. We’re soon beginning a test in News Feed that will show people the top three Trending stories, which they can click on to see the full list of Trending topics and explore what people are discussing on Facebook.

While most people will not see Trending in their News Feed as part of this small test, we hope that it will help us learn how to make Trending as useful and informative for people as possible. If you do see the Trending unit in your News Feed, you have the option to remove it in the drop-down menu which will prevent it from being shown to you in the future.

As before, we continue to listen to feedback about Trending and will keep making improvements in order to provide a valuable experience.


Expanding Facebook Fundraisers to More People and Causes

Facebook - Wed, 05/24/2017 - 15:05

By Naomi Gleit, VP Social Good

Facebook is a place where people come together to connect with their communities and support one another in meaningful ways. Today, we are giving people another way to mobilize around causes they care about by expanding personal fundraisers to everyone over 18 in the US and by adding two new categories – community and sports.

We began testing personal fundraisers, a new product that allows people to raise money for a friend, themselves or a sick pet directly on Facebook, in March. Since then, we’ve been inspired by how people have responded by creating them, and by the support felt by those they benefit.

People can create a fundraiser to quickly raise money on Facebook and easily reach their friends in a few taps, without leaving Facebook, and can share fundraisers to help build momentum. People can learn about the person who created the fundraiser and the person benefiting from the fundraiser, as well as see which friends have donated. Now people can raise money for any of the following categories:

    • Education: such as tuition, books or classroom supplies
    • Medical: such as medical procedures, treatments or injuries
    • Pet Medical: such as veterinary procedures, treatments or injuries
    • Crisis Relief: such as public crises or natural disasters
    • Personal Emergency: such as a house fire, theft or car accident
    • Funeral and Loss: such as burial expenses or living costs after losing a loved one
    • Sports: such as equipment, competitions or team fees
    • Community: such as neighborhood services, community improvements or environmental improvements

Nonprofit fundraisers continue to be available for people on Facebook to raise funds and awareness for 501(c)(3) nonprofits.

It’s easy to get started:

  1. On mobile, tap the menu icon and select Fundraisers, or on desktop, go to facebook.com/fundraisers
  2. Choose to raise money for a Friend, Yourself or Someone or Something Not on Facebook
  3. Give your fundraiser a title and compelling story, and start raising money

All fundraisers are reviewed within 24 hours. Personal fundraisers are available on all devices, and have a 6.9% + $0.30 fee that goes to payment processing, fundraiser vetting, and security and fraud protection. Facebook’s goal is to create a platform for good that’s sustainable over the long-term, and not to make a profit from our charitable giving tools.
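
For a rough sense of the stated fee, here is a tiny worked example, assuming the 6.9% + $0.30 applies to each individual donation; the function name is purely illustrative.

```python
def payout_after_fees(donation: float) -> float:
    """Amount remaining from a single donation after the stated 6.9% + $0.30 fee."""
    fee = donation * 0.069 + 0.30
    return round(donation - fee, 2)

# A $100 donation carries a $7.20 fee, leaving $92.80 for the fundraiser.
print(payout_after_fees(100.00))   # 92.8
```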

We’re constantly inspired by the good people on Facebook do, and we’re excited to learn more about how people use this new product so we can continue improving the experience.

Find out more about Facebook fundraisers at facebook.com/fundraisers.


More Ways To Connect with Friends in Facebook Live

Facebook - Tue, 05/23/2017 - 18:00

By Erin Connolly, Product Manager, and Fred Beteille, Product Manager

We know Facebook Live is better with friends. We’ve been working on ways to make Live more fun, social and interactive, like with the new Live interactive effects we announced last month. Today we’re excited to announce two new features that make it easier to share experiences and connect in real time with your friends on Live.

Live Chat With Friends

One of the best things about Live is that you can discuss what’s happening in the broadcast in real time. In fact, people comment more than 10 times as much on Facebook Live videos as on regular videos. When it comes to compelling public broadcasts — such as a breaking news event, a Q&A with your favorite actor or behind-the-scenes action after a big game — watching with the community and reading comments is an exciting part of the experience. We know sometimes people also want the option to interact with only their friends during a public live broadcast, so we’re rolling out Live Chat With Friends.

Live Chat With Friends lets you invite friends to a private chat about a public live broadcast. You can invite friends who are already watching or other friends who you think may want to tune in. You’re able to jump back into the public conversation at any time, and you can still continue chatting with your friends via Messenger after the broadcast ends.

With Live Chat With Friends, you can be part of big moments with the wider community but also have the option to participate in personal conversations with the people closest to you, directly within the Live experience. We’re testing this feature on mobile in several countries, and we look forward to making it available more broadly later this summer.

Live With
Last year we started rolling out the ability for public figures to go live with a guest. Now available for all profiles and Pages on iOS, Live With lets you invite a friend into your live video so you can hang out together, even if you’re not in the same place. Sharing the screen with a friend can make going live more fun and interactive — for both you and your viewers.

To invite a friend to join you in your live video, simply select a guest from the Live Viewers section, or tap a comment from the viewer you want to invite. Your viewer can then choose whether or not to join your broadcast. You can go live with a guest in both portrait mode (for a picture-in-picture experience) and landscape mode (for a side-by-side experience). For a full tutorial, click here.

We’re excited to see how people use these Facebook Live features to come together around moments big and small.


Facebook’s Community Standards: How and Where We Draw the Line

Facebook - Tue, 05/23/2017 - 15:00

By Monika Bickert, Head of Global Policy Management

Last month, people shared several horrific videos on Facebook of Syrian children in the aftermath of a chemical weapons attack. The videos, which also appeared elsewhere on the internet, showed the children shaking, struggling to breathe and eventually dying.

The images were deeply shocking – so much so that we placed a warning screen in front of them. But the images also prompted international outrage and renewed attention on the plight of Syrians.

Reviewing online material on a global scale is challenging and essential. As the person in charge of doing this work for Facebook, I want to explain how and where we draw the line.

On an average day, more than a billion people use Facebook. They share posts in dozens of languages: everything from photos to live videos. A very small percentage of those will be reported to us for investigation. The range of issues is broad – from bullying and hate speech to terrorism – and complex. Designing policies that both keep people safe and enable them to share freely means understanding emerging social issues and the way they manifest themselves online, and being able to respond quickly to millions of reports a week from people all over the world.

For our reviewers, there is another hurdle: understanding context. It’s hard to judge the intent behind one post, or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?

In the UK, being critical of the monarchy might be acceptable. In some parts of the world it will get you a jail sentence. Laws can provide guidance, but often what’s acceptable is more about norms and expectations. New ways to tell stories and share images can bring these tensions to the surface faster than ever.

We aim to keep our site safe. We don’t always share the details of our policies, because we don’t want to encourage people to find workarounds – but we do publish our Community Standards, which set out what is and isn’t allowed on Facebook, and why.

Our standards change over time. We are in constant dialogue with experts and local organizations, on everything from child safety to terrorism to human rights.  Sometimes this means our policies can seem counterintuitive. As the Guardian reported, experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats. When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time.

We try hard to stay objective. The cases we review aren’t the easy ones: they are often in a grey area where people disagree. Art and pornography aren’t always easily distinguished, but we’ve found that digitally generated images of nudity are more likely to be pornographic than handmade ones, so our policy reflects that.

There’s a big difference between general expressions of anger and specific calls for a named individual to be harmed, so we allow the former but don’t permit the latter.

These tensions – between raising awareness of violence and promoting it, between freedom of expression and freedom from fear, between bearing witness to something and gawking at it – are complicated, and there are rarely universal legal standards to provide clarity. Being as objective as possible is the only way we can be consistent across the world. But we still sometimes end up making the wrong call.

The hypothetical situations we use to train reviewers are intentionally extreme. They’re designed to help the people who do this work deal with the most difficult cases. When we first created our content standards nearly a decade ago, much was left to the discretion of individual employees. But because no two people will have identical views of what defines hate speech or bullying – or any number of other issues – we now include clear definitions.

We face criticism from people who want more censorship and people who want less. We see that as a useful signal that we are not leaning too far in any one direction.

I hope that readers will understand that we take our role extremely seriously. For many of us on the team within Facebook, safety is a passion that predates our work at the company: I spent more than a decade as a criminal prosecutor, investigating everything from child sexual exploitation to terrorism. Our team also includes a counter extremism expert from the UK, the former research director of West Point’s Combating Terrorism Center, a rape crisis center worker, and a teacher.

All of us know there is more we can do. Last month, we announced that we are hiring an extra 3,000 reviewers. This is demanding work, and we will continue to do more to ensure we are giving them the right support, both by making it easier to escalate hard decisions quickly and by providing the psychological support they need.

Technology has given more people more power to communicate more widely than ever before. We believe the benefits of sharing far outweigh the risks. But we also recognize that society is still figuring out what is acceptable and what is harmful, and that we, at Facebook, can play an important part of that conversation.


News Feed FYI: New Updates to Reduce Clickbait Headlines

Facebook - Wed, 05/17/2017 - 19:00

By Arun Babu, Engineer, Annie Liu, Engineer, and Jordan Zhang, Engineer

People tell us they don’t like stories that are misleading, sensational or spammy. That includes clickbait headlines that are designed to get attention and lure visitors into clicking on a link. In an effort to support an informed community, we’re always working to determine what stories might have clickbait headlines so we can show them less often.

Last year we made an update to News Feed to reduce stories from sources that consistently post clickbait headlines that withhold and exaggerate information. Today, we are making three updates that build on this work so that people will see even fewer clickbait stories in their feeds, and more of the stories they find authentic.

  • First, we are now taking into account clickbait at the individual post level in addition to the domain and Page level, in order to more precisely reduce clickbait headlines.
  • Second, in order to make this more effective, we are dividing our efforts into two separate signals, so we will now look separately at whether a headline withholds information and whether it exaggerates information.
  • Third, we are starting to test this work in additional languages.

How We Are Improving Our Efforts

One of our News Feed values is authentic communication, so we’ve been working to understand what people find authentic and what people do not.

We’ve learned from last year’s update that we can better detect different kinds of clickbait headlines by separately — rather than jointly — identifying signals that withhold or exaggerate information.

Headlines that withhold information intentionally leave out crucial details or mislead people, forcing them to click to find out the answer. For example, “When She Looked Under Her Couch Cushions And Saw THIS…” Headlines that exaggerate the details of a story with sensational language tend to make the story seem like a bigger deal than it really is. For example, “WOW! Ginger tea is the secret to everlasting youth. You’ve GOT to see this!”

We addressed this similarly to how we previously worked to reduce clickbait: We categorized hundreds of thousands of headlines as clickbait or not clickbait by considering if the headline exaggerates the details of a story, and separately if the headline withholds information. A team at Facebook reviewed thousands of headlines using these criteria, validating each other’s work to identify large sets of clickbait headlines.

From there, we identify what phrases are commonly used in clickbait headlines that are not used in other headlines. This is similar to how many email spam filters work.
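
Since the post compares this to email spam filtering, here is a minimal, hypothetical sketch of phrase-frequency scoring in that spirit. The toy corpora, smoothing and scoring are illustrative only and stand in for the large labeled headline sets described above.

```python
from collections import Counter
import math

# Tiny illustrative corpora; a real system would use the hundreds of
# thousands of labeled headlines mentioned in the post.
clickbait = [
    "you won't believe what happened next",
    "she looked under her couch cushions and saw this",
]
not_clickbait = [
    "city council approves new transit budget",
    "local team wins regional championship",
]

def phrase_counts(headlines):
    return Counter(word for h in headlines for word in h.lower().split())

cb_counts, ok_counts = phrase_counts(clickbait), phrase_counts(not_clickbait)
cb_total, ok_total = sum(cb_counts.values()), sum(ok_counts.values())

def clickbait_score(headline: str) -> float:
    """Sum of log-likelihood ratios: positive means clickbait-like wording."""
    score = 0.0
    for word in headline.lower().split():
        p_cb = (cb_counts[word] + 1) / (cb_total + 2)   # add-one smoothing
        p_ok = (ok_counts[word] + 1) / (ok_total + 2)
        score += math.log(p_cb / p_ok)
    return score

print(clickbait_score("you won't believe this simple trick"))   # > 0
print(clickbait_score("council approves budget"))               # < 0
```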

Posts with clickbait headlines will appear lower in News Feed. We will continue to learn over time, and we hope to continue expanding this work to reduce clickbait in even more languages.

Will This Impact My Page?

We anticipate that most Pages won’t see any significant changes to their distribution in News Feed as a result of this update.

Publishers that rely on clickbait headlines should expect their distribution to decrease. Pages should avoid headlines that withhold information required to understand the content of the article and headlines that exaggerate the article to create misleading expectations. If a Page stops posting clickbait and sensational headlines, their posts will stop being impacted by this change.

As always, Pages should refer to our publishing best practices. We will learn from these changes and will continue to work on reducing clickbait so News Feed is a place for authentic communication.


Connecting People With Mental Health Resources and Building a Safer Community

Facebook - Tue, 05/16/2017 - 22:00

By Antigone Davis, Global Head of Safety

May is Mental Health Awareness Month in the US, and this month Facebook is letting people know about the tools and resources we have developed for people who may be struggling. People may see videos or photos in News Feed for a broad awareness campaign about supportive groups, crisis support over Messenger and suicide prevention tools.

We’ve been committed to mental health support for many years, and this is one of the ways we’re working to build a safer and more supportive community on Facebook. As we continue to invest in new tools and resources, we hope Facebook can help provide support to more people over time. For example, Mama Dragons, a Utah community of mothers with LGBTQ children, uses Facebook Groups to share experiences and offer support.

Finding Supportive Groups

On Facebook, people can connect to groups that support them through difficult times. Throughout May, we’ll be helping more people find groups about mental health and well-being.

Crisis Support Over Messenger

People can talk in real time with trained crisis and mental health support volunteers over Messenger. Participating organizations include Crisis Text Line, the National Eating Disorder Association, Partnership for Drug-Free Kids and the National Suicide Prevention Lifeline. We are also happy to announce that we will be adding The Trevor Project, an organization focused on crisis intervention and suicide prevention for LGBTQ youth. The option will roll out over the next few months.

Suicide Prevention Tools and Resources

We’ve offered suicide prevention tools on Facebook for more than 10 years. We developed these in collaboration with mental health organizations such as Save.org, National Suicide Prevention Lifeline, Forefront and Crisis Text Line, and with people who have personal experience thinking about or attempting suicide. Last year we expanded the availability of these tools worldwide with the help of over 70 partners, and we’ve improved them based on new technology and feedback from the community.

This month Instagram is also helping to raise awareness about mental health and the communities of support that exist on the platform. To learn more about the tools and resources available on Instagram and the #HereForYou initiative, visit instagram-together.com.

Together, we hope these resources help more people who may be struggling, and we’re continuously improving them to build a safer and more supportive community on Facebook.


Video Carousel Ads on Smartphone Mobile Web

Facebook - Tue, 05/16/2017 - 19:00

During our regular reviews to ensure the accuracy of our systems, we recently found and fixed a bug that misattributed some clicks on video carousel ads as link clicks. This bug occurred when people were on mobile web browsers on smartphones — not on desktop or in the Facebook mobile app.

The bug affected billing only for the following conditions: for the video carousel ad unit; when the advertiser chose to bid on link clicks; and only for people who were on smartphone web browsers. In these cases, instead of being billed only for link clicks (clicks to an advertiser’s selected destination), these advertisers were incorrectly billed when people clicked on the videos in the carousel to enlarge and watch them. Advertisers will receive a full credit for the charges they incurred for these misattributed clicks.

Most consumers use Facebook through the app on their phones, and mobile web browser ad impressions make up a small percentage of the overall ad impressions people see on Facebook. Given that this bug related to mobile web for smartphones only, and specifically for video carousel ads that bid on link clicks, the impact from a billing perspective was 0.04% of ad impressions. Regardless of how many impressions were affected, we take all bugs seriously and apologize for any inconvenience this has caused.


Join Facebook In Celebrating Moms Around the World

Facebook - Sat, 05/13/2017 - 20:00

People come to Facebook to express their diverse opinions and experiences, yet there is one thing that brings the community together in celebration all over the world — moms!

In 2016, Mother’s Day drove more posts in one day than any other topic on Facebook, with more than 105 million Mother’s Day posts. In fact, to show thanks and gratitude, people came to Facebook to post photos and videos, which spiked significantly on Mother’s Day with over 850 million photos and videos shared.*

This year, Facebook is providing new ways for you to show mom or a loved one that you appreciate everything they do.

Send a Personalized Card

Today you might see a message from Facebook in your News Feed wishing you a happy Mother’s Day and inviting you to share a card with a mom or loved one. On mobile, you can personalize some of the cards by adding a photo, giving you the opportunity to share what Mother’s Day means to you.

Jazz Up Your Photos

In Facebook Camera, you can find some new colorful Mother’s Day-themed masks and frames. If you swipe right on News Feed, go to the effects tray and you’ll be able to find the effects in the mask and frame category.

Support a Cause In Honor Of Mom

In the US, start a Facebook fundraiser or donate to your favorite cause in honor of your mom. On Facebook, you can raise money for a nonprofit or people — for yourself, a friend or someone or something not on Facebook.

Show Your Thanks

You may see a new “thankful” addition to reactions during the days surrounding Mother’s Day. When a person chooses this temporary flower reaction, they’ll see something special that wraps around the post they’re reacting to.

Add a Sticker to Your Instagram Story

Finally, if you’re using Instagram, you may also see a set of new stickers to help you celebrate the mothers in your life. Simply open the camera and take a photo or video, then add as many stickers as you want.

We hope everyone will join in celebrating everything that mothers do for us on this special day. Even by simply wishing someone “Happy Mother’s Day” in the comments (Psst…try it and see what happens!) you will be contributing to a worldwide outpouring of appreciation for moms everywhere.

*Data from May 8, 2016


Reducing Links to Low-Quality Web Page Experiences

Facebook - Wed, 05/10/2017 - 17:00

By Jiun-Ren Lin and Shengbo Guo

We want to help people build an informed community on Facebook. That’s why we’re always working to understand which posts people consider misleading, sensational and spammy so we can show fewer of those and show more informative posts instead.

We hear from our community that they’re disappointed when they click on a link that leads to a web page that contains little substantive content and is covered in disruptive, shocking or malicious ads. People expect their experience after clicking on a post to be straightforward.

Starting today, we’re rolling out an update so people see fewer posts and ads in News Feed that link to these low-quality web page experiences. Similar to the work we’re already doing to stop misinformation, this update will help reduce the economic incentives of financially-motivated spammers.

A More Informative Experience

We have had a policy in place since last year to prevent advertisers with low-quality web page experiences from advertising on our platform. Now, we are increasing enforcement on ads and also taking into account organic posts in News Feed.

With this update, we reviewed hundreds of thousands of web pages linked to from Facebook to identify those that contain little substantive content and have a large number of disruptive, shocking or malicious ads. We then used artificial intelligence to understand whether new web pages shared on Facebook have similar characteristics. So if we determine a post might link to these types of low-quality web pages, it may show up lower in people’s feeds and may not be eligible to be an ad. This way people can see fewer misleading posts and more informative posts.
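
As a rough illustration of the kind of characteristics described above, the sketch below flags pages that combine very little text with heavy or disruptive advertising. The feature names and thresholds are assumptions; the actual system uses a learned model rather than fixed rules.

```python
from dataclasses import dataclass

@dataclass
class PageFeatures:
    word_count: int        # amount of substantive text on the landing page
    ad_count: int          # number of ad slots on the page
    disruptive_ads: int    # pop-ups, interstitials, shocking or malicious ads

# Thresholds are illustrative; the production system learns these
# characteristics from hundreds of thousands of human-reviewed pages.
def is_low_quality(page: PageFeatures) -> bool:
    too_little_content = page.word_count < 200
    ad_heavy = page.ad_count >= 10 or page.disruptive_ads >= 3
    return too_little_content and ad_heavy

links = {
    "thin-ad-farm.example": PageFeatures(word_count=80, ad_count=14, disruptive_ads=5),
    "news-article.example": PageFeatures(word_count=900, ad_count=4, disruptive_ads=0),
}

# Posts linking to low-quality pages would rank lower and be ineligible as ads.
for url, feats in links.items():
    print(url, "-> low quality" if is_low_quality(feats) else "-> ok")
```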

Will This Impact My Page or Ad Account?

These changes will roll out gradually over the coming months. Publishers that do not have the type of low-quality landing page experience referenced may see a small increase in traffic, while publishers who do should see a decline in traffic. This update is one of many signals we use to rank News Feed, so impact will vary by publisher, and Pages should continue posting stories their audiences will like.

For advertisers and publishers looking for tips on how to improve their web experiences, please read the full Facebook Business post and visit our Help Center.


Facebook Reports First Quarter 2017 Results

Facebook - Wed, 05/03/2017 - 22:13

Click here for details on Facebook’s financial results for the first quarter ended March 31, 2017.


Game On: Games on Messenger Go Global with New Features and Games

Facebook - Tue, 05/02/2017 - 18:00

By Andrea Vaccari, Product Manager, Messenger

Today we’re starting to roll out Instant Games on Messenger more broadly for the 1.2 billion people who use Messenger every month. In addition, we’re launching the new features that we exclusively previewed to developers at our annual F8 conference. These include rich gameplay features, which allow developers to create unique and sophisticated experiences, and Game bots, which help game makers surface exciting features like new levels and rewards. When developers start to take advantage of these new capabilities, everyone wins with more dynamic and engaging gaming experiences.

Rich gameplay features such as turn-based games (our most requested feature) can also weave in leaderboards and tournaments, and offer more visually engaging and customizable game messages during play. Game bots help re-engage players by calling out new game options and encouraging competition with updates on the leaderboards.

One of the first games to take advantage of the new rich gameplay features is Zynga’s Words With Friends. Words With Friends, one of the most popular game apps of all time, is now available as a feature-rich, turn-based game right in Messenger.

 

People will begin to see new features as game developers start to incorporate the new capabilities; one of Instant Games on Messenger’s biggest hits, Blackstorm’s EverWing, is among the first to use Game bots. Depending on what device you use and where you are located, there are now up to 50 game titles available on Messenger, with more being introduced almost every week. We’re also excited to be bringing the world’s #1 pool game — Miniclip’s 8 Ball Pool — to Instant Games very soon.

The new Instant Games on Messenger will roll out over the next few weeks worldwide for both iOS and Android.

