With social media usage continuing to grow, businesses need to pay attention to the third-party, user-generated content appearing on their pages.
Recent decisions from the Advertising Standards Board (ASB) (the Smirnoff Facebook page determination) and the Allergy Pathway case in the Federal Court (ACCC v Allergy Pathway No. 2) indicate that businesses may be held responsible for user-generated or third-party content on their social media platforms. Although compliance with advertising codes is voluntary and the Board's determinations are not legally binding, the extent of a business's responsibility for content posted by third parties remains unclear under Australian law.
What does this mean for businesses and their brands on social media? At a minimum, it requires businesses to be more vigilant about the nature of comments posted on their social media platforms. Although constant moderation is not currently required and the law in this area remains unclear, a proactive strategy is a positive and practical approach.
Here are four simple steps to increase your protection.
1. Have a Social Media Policy
Make sure your organisation has a social media policy, and that it is being followed. Link to the policy from your social media platforms and website so that third parties know what your guidelines are. This gives your organisation a consistent set of procedures.
2. Moderate
You can achieve certain levels of moderation by logging in as a page administrator. For example, Facebook has a feature that allows particular words to be blocked from appearing in comments on the page. This is a good starting point.
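The word-blocking idea can be sketched as a simple keyword filter. This is an illustrative sketch only; the blocklist, function names, and matching logic are hypothetical and not part of Facebook's actual moderation tools:

```python
# Illustrative keyword-moderation sketch. The blocklist and helper
# names are hypothetical, not any platform's real API.
BLOCKED_WORDS = {"scam", "fraud", "ripoff"}  # example terms only


def is_blocked(comment: str) -> bool:
    """Return True if the comment contains any blocked word."""
    words = comment.lower().split()
    return any(word.strip(".,!?") in BLOCKED_WORDS for word in words)


def moderate(comments: list[str]) -> list[str]:
    """Keep only comments that contain no blocked words."""
    return [c for c in comments if not is_blocked(c)]
```

A real platform feature would hide matching comments automatically; the point of the sketch is simply that word-level blocking catches only exact terms, so it complements, rather than replaces, human monitoring.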
3. Monitor
It is important to check what individuals are writing on your pages, not because you are required to, but because it is a sensible approach. If third-party content appears inappropriate, it is practical to remove it as soon as you become aware of it.
4. Act Promptly
When a business receives a complaint about third-party content, it is important to act quickly. Such a complaint alerts the business to a particular issue, so it should review the content immediately and remove it if necessary.
Although user-generated content can be difficult to manage, particularly if your organisation has thousands of likes or followers, it is something organisations should be actively doing. Most people who submit content have the best intentions; however, there will always be someone who airs grievances just to cause trouble. Organisations therefore need to be on the lookout for inappropriate or misleading content. Ideally, posts should be published only after they have been reviewed and moderated by the organisation.