Content: The word has become so ubiquitous that it has lost much of its meaning. And yet, how can something as universally seen, consumed, shared and discussed as content be so poorly understood by nearly all parties?
In part one of this series, I discussed the current state of social content as it relates to the platforms responsible for its dissemination. Social networks are continuing to feel the bumps in the road that come with the territory of being industry pioneers—by creating a new space, they have assumed some responsibility in keeping their highly visible products safe for public use.
How much responsibility do social networks have for what gets posted?
Social networks seem to be at a crossroads. They’re trying to build communities while also raising revenue through advertising and paid promotions. If social networks want (and need) to make money, where do they step in, draw the line and say, “Enough is enough” when it comes to objectionable content and accounts?
More importantly, do legitimate users, whether everyday people or brands, have any right to expect the networks to be able to control user content?
In some ways, it looks like the answer is no. Taken to task for not taking action against Alex Jones sooner, Twitter CEO Jack Dorsey was quoted as saying, “Accounts like Jones’ can often sensationalize issues and spread unsubstantiated rumors, so it’s critical that journalists document, validate and refute such information directly so people can form their own opinions. This is what serves the public conversation best.”
What can brands do to protect themselves?
In many ways, it seems as if the responsibility falls to brands and other users of social media to help create a safer space online. Large brands and agencies have begun to apply pressure to the social networks themselves by trimming back spend until their concerns are addressed. Others are becoming more aggressive in vetting the social media influencers who engage on those platforms, carefully examining each piece of content they have produced and analyzing their followings.
Social media managers are also looking beyond follower counts and curating their own brand networks. They are monitoring followers on brand accounts, blocking users who seem suspicious or fake, and weeding out those who regularly post offensive comments or content.
Ultimately, it is up to the brand to determine where they want to invest their resources and what level of brand safety they need in a world where content creation continues to accelerate. An upstart energy drink is going to have a different audience and line in the content sand from a manufacturer of baby food.
This isn’t going to be a problem that is solved overnight with a new algorithm. The stakes are too high and the spectrum of views on what is “right” and “wrong” is too broad.
In a society that values individuality and freedom of speech, it is unclear if the social networks can create a user experience that is universally considered both safe and engaging for brands and end-users. We are likely heading into a world of more fragmented and customized social media experiences, where what is considered “safe and appropriate” to a given user is what they are fed by the social networks.
That idea in itself is controversial, and I for one hope this debate continues both online and off.
Ted Murphy is founder and CEO of influencer marketing platform Izea.
from Adweek https://ift.tt/2zTBSZ7