Cross.social DAO
Cross.social is a newly developed social platform that can be tailored to suit any idea or project, allowing the author of a concept to build and cultivate a community around it. It can be thought of as a personalized version of Facebook, but governed by a Decentralised Autonomous Organisation (DAO) mechanism. Let's take a step-by-step approach to understand it better.
What is a DAO?
A DAO is an entity that operates without a central leadership and is governed by the vote of all its members in proportion to the number of tokens they hold. Decisions are made from the bottom up, and the community is structured around specific rules that are enforced by blockchain technology.
Essentially, a DAO is a virtual organization that is directly controlled by its members through the use of tokens.
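The token-weighted governance described above can be sketched in a few lines of code. This is a minimal illustration, not the Cross.social implementation; the `Member` structure and `tally` function are assumptions made for the example:

```python
from dataclasses import dataclass

@dataclass
class Member:
    address: str
    tokens: int  # voting power is proportional to token balance

def tally(votes: dict[str, bool], members: dict[str, Member]) -> bool:
    """Weight each member's vote by their token balance.

    `votes` maps a member address to True (for) or False (against).
    The proposal passes if the token weight in favour exceeds
    the token weight against.
    """
    weight_for = sum(members[a].tokens for a, v in votes.items() if v)
    weight_against = sum(members[a].tokens for a, v in votes.items() if not v)
    return weight_for > weight_against
```

Note that under this rule a single large token holder can outvote several smaller ones, which is exactly what "in proportion to the number of tokens they hold" implies.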
How does cross.social present itself?
Although the official cross.social website is not expected to launch until the end of the year, the initial users of the platform are already familiar with the platform's whitepaper, which outlines its vision, mission, and basic principles. The platform's primary slogan, which best encapsulates its essence, is "A home for uncensored, free speech, owned by every user."
What does uncensored, free speech mean? Are there no rules for cross.social content?
In the era of Cancel Culture, where fundamental principles are being redefined, the concept of free speech has also evolved. While there may not be explicit bans on sharing ideas, societal fragmentation, public condemnation, and cancellations have significantly altered what can and cannot be expressed. Interestingly, this has led to a very centralized and often unnoticed approach to speech, where the owners of traditional media platforms determine what is deemed correct or incorrect. Using algorithms, fact-checkers, and their own biases, they categorize users and regulate their reach based on perceived correctness.
While this may seem reasonable, it is hard to ensure impartiality when bias is inevitable. Media giants have different ideologies and broadcast them to millions of users, leading to migration towards alternative channels. The global pandemic has facilitated this migration further.
Cross.social does not engage in centralized censorship of speech or fairness assessments, nor does it restrict disgruntled users' accessibility. However, this does not mean that anything goes on the platform.
What are the basic principles of cross.social?
Cross.social is built on several key principles that set it apart from other social media platforms.
Firstly, it is designed to be accessible to all users, with no centralized limits on communication or accessibility. Users have complete control over their privacy settings and can choose who to follow, what content to see, and who to communicate with.
Secondly, the platform is built on transparency, with all transactions visible on the blockchain in real-time. This provides an additional layer of security and accountability for users.
Thirdly, cross.social is highly adaptable and can be easily integrated into any project, idea, or community. This makes it a flexible platform that can accommodate a wide range of content creators and fans.
Finally, the platform offers a broad range of functionality that incorporates the best features of traditional media channels. This means that users can create and share content in a variety of formats and styles, making it an ideal platform for creative expression and community building.
What content is restricted on cross.social?
Cross.social has automated algorithms in place to restrict prohibited activities, such as pornography, child abuse, animal cruelty, incitement to crime, illegal activities, terrorism, and other activities prohibited by global or local laws.
How does moderation of posts/profiles work on a decentralized platform?
Moderation of content and profiles on a decentralized platform like Cross.social works differently than on conventional platforms. The platform operates like a small democracy: when a report of inappropriate content is received, DAO members vote on whether that content is acceptable. There is no fixed rulebook defining right and wrong; members may hold completely different views and decide according to their personal beliefs. On a conventional platform, content is moderated by an algorithm or a single person; on a DAO platform, it is moderated by many people. This makes it extremely difficult to reject a viewpoint on purely ideological grounds when it does not contravene any law, since what is acceptable to one person may not be acceptable to another.
How is content moderation carried out?
Content moderation on the platform is carried out in two ways. Firstly, an automatic algorithm removes posts or profiles that violate platform policies. However, this system may not be perfect and can sometimes make mistakes. Secondly, any user can report content they believe is inappropriate, which is then queued for review by DAO members. These members have 30 days to review the content and vote on whether it violates the law or not, with the options being Yes, No, Abstain, or Veto. Based on the outcome of the vote, the post or profile in question may be removed or allowed to remain on the platform.
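As a rough sketch, the four-option vote described above might be tallied as follows. The exact thresholds and the semantics of the Veto option are illustrative assumptions (modeled loosely on common DAO governance conventions), not rules taken from the Cross.social whitepaper:

```python
from collections import Counter
from enum import Enum

class Vote(Enum):
    YES = "yes"          # the content violates policy and should be removed
    NO = "no"            # the content should remain
    ABSTAIN = "abstain"  # counted for turnout only
    VETO = "veto"

def decide_removal(votes: list[Vote]) -> bool:
    """Decide whether reported content is removed after the review period.

    Assumed rules for this sketch:
      * a Veto share above one third of all ballots blocks removal outright;
      * otherwise, removal requires more Yes than No votes;
      * abstentions do not count toward either side.
    """
    counts = Counter(votes)
    if counts[Vote.VETO] > len(votes) / 3:
        return False
    return counts[Vote.YES] > counts[Vote.NO]
```

For example, a report drawing two Yes votes, one No, and one Abstain would result in removal, while a sizeable Veto bloc would keep the content up regardless of the Yes count.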