By Andrecia Lewis
Would it be ethical to hold a bookstore owner accountable for the content of the books on their shelves? Many would argue such a practice is unfair; after all, the content of a book is the responsibility of its writer. If this principle were applied to Big Tech social media sites, would it be unjust to hold these companies liable for user content? Or should the responsibility fall solely on the users?
Numerous scientific studies have identified social media platforms as a major contributing factor to many social ills. Instances of bullying, harassment, misogyny, racism, prejudice, misinformation and disinformation frequently go unchecked on these platforms. Recently, Argentina’s President Javier Milei endorsed a cryptocurrency on X, causing its value to spike before it dropped suddenly, costing many traders a significant amount of money. His tweet was swiftly deleted, but only after the damage had already occurred. Social media has also been regarded as a hub for terrorist organizations to spread propaganda and recruit individuals. Again, should the blame be laid on the user, or on the platforms’ lack of swift and effective regulatory controls?
Stricter regulation of these platforms may seem the perfect solution in theory. But is it possible for social media sites to monitor and regulate every interaction? More importantly, how can this be carried out without infringing upon users’ privacy and freedom of speech? And even if such regulations exist, can they be enforced?
The European Union developed the Digital Services Act (DSA) to hold tech giants accountable for the content on their platforms. The law aims to protect citizens, especially minors, by banning targeted ads aimed at persons between the ages of 13 and 17. The DSA also bans psychographic profiling, that is, targeted ads based on a person’s race, religion, attitudes and the like. Moreover, it requires that users be able to opt out of algorithmic, personalized content in their news feeds, which reduces filter bubbles.
In contrast, Section 230 of the United States’ Communications Decency Act does the opposite. These “26 words that created the internet” protect social media platforms and tech giants from being held liable for content posted by users. The reality is that if this were not the case, the internet and social media as we know them today would not exist. Whether that is good or bad is a matter of opinion.
What does all of this mean for the small island developing states of the Caribbean? CARICOM could follow in the footsteps of the European Union and create a policy to protect citizens and attempt to force tech giants into submission. Or perhaps this concern is not of immediate relevance to SIDS, and CARICOM’s focus should instead be on developing a comprehensive and effective CARICOM Digital Literacy Policy.