How likely is it that Elon Musk (shareholder 1) & Saudi Arabia (shareholder 2) owning X hinders human rights? (*AI Research Date: 11/21/2025)
GoogleAI Mode: “It is widely considered highly likely that the current ownership structure and resulting policy shifts at X (formerly Twitter), driven by Elon Musk and involving Saudi Arabian investment, pose significant risks to human rights, particularly regarding freedom of expression, privacy, and user safety.
Human rights organizations and researchers have documented concerns and observed negative impacts:
Impact of Elon Musk’s Leadership
Reduced Content Moderation: Under Elon Musk, X significantly cut staff, including content moderators, and dissolved its volunteer advisory group, the Trust and Safety Council. This has led to an increase in hate speech, harassment, and extremist content on the platform, according to reports from research groups like the Center for Countering Digital Hate (CCDH) and the Anti-Defamation League.
“Freedom of Reach” Philosophy: Musk’s approach emphasizes “freedom of speech, not freedom of reach,” meaning content might not be removed but its visibility is restricted. Critics argue this approach is inconsistently applied and allows harmful content to remain on the platform, potentially breaching corporate human rights obligations to protect users from harm.
Privacy Concerns: X’s updated privacy policy includes provisions for collecting biometric data, employment history, and metadata from private messages to train AI models. Amnesty International has warned that this “carte blanche approach to profit-making poses a serious risk to individual rights,” particularly the right to privacy.
Lack of Transparency and Accountability: Musk has engaged in legal battles with independent researchers and non-profits, such as the CCDH, who track hate speech and analyze content moderation decisions. This has been viewed as an attempt to stifle independent scrutiny and evade public accountability, making it harder to assess the platform’s impact on human rights.
Impact of Saudi Arabian Investment
Government Influence: Saudi Arabia’s Kingdom Holding Company (KHC), controlled by Prince Alwaleed bin Talal al Saud, is a major, though minority, shareholder in X. KHC is 16.9% owned by the Saudi Arabian sovereign wealth fund, the Public Investment Fund (PIF).
Past Abuses: Prior to Musk’s takeover, X (then Twitter) was accused in a lawsuit of helping Saudi Arabia commit human rights abuses by disclosing user data at a much higher rate than it did for Western countries. A former Twitter employee was even accused of accessing the personal information of more than 6,000 accounts on behalf of the Saudi government.
Conflict of Interest: The involvement of a government with an authoritarian record on human rights and suppression of dissent in a major global communication platform creates an inherent conflict of interest. Human Rights Watch has detailed numerous abuses linked to Saudi Arabia’s PIF, including arbitrary detentions and lack of transparency.
Overall, the combination of Musk’s content moderation rollbacks and the investment from an authoritarian state amplifies concerns that business interests and personal whims may take precedence over universal principles of free expression and human rights.”
How much did Elon Musk and Saudi Arabia make on X in 2025? (*AI Research Date: 12/22/2025 for this question and below)
GoogleAI Mode: “In 2025, the social media platform X, owned by Elon Musk and other investors including a Saudi Prince, generated over $2 billion in revenue in the first nine months, with projected total revenue of around $2.9 billion for the full year. However, the company reported a net loss of $577.4 million in Q3 2025 due to high costs and restructuring charges.
X is a private company, meaning Elon Musk and Saudi Arabia (via the Kingdom Holding Company investment firm) did not “make” a direct profit in the traditional sense of a publicly traded stock, where they would receive dividends or capital gains from public market sales. Any “gain” is primarily a paper change in the valuation of their stakes, and the valuation of X itself has fluctuated and remains below its original $44 billion acquisition price from 2022.
Financial Overview
Company Performance: X’s revenue for the first nine months of 2025 was over $2 billion, an increase of approximately 18% year-over-year.
Profitability Metrics: Despite the net loss, X’s adjusted EBITDA (earnings before interest, taxes, depreciation, and amortization) was approximately $454 million in Q3 2025, a 16% increase from the same period last year. This metric suggests improved operational profitability, even with high one-time costs.
Ownership Stakes:
Elon Musk is the majority shareholder of X, which is now a wholly-owned subsidiary of his AI startup, xAI, after an all-stock deal in March 2025 that valued X at $33 billion.
Saudi Arabia’s Prince Alwaleed bin Talal, through his firm Kingdom Holding Company, is the second-largest shareholder and rolled over a $1.89 billion stake during the original acquisition.
Valuation Changes: While a Saudi Prince claimed in February 2025 that X had doubled in value, internal and external valuations later in the year were more conservative, with the xAI deal valuing it at $33 billion.
Their potential “earnings” on X in 2025 are therefore tied to the unrealized gains from the company’s fluctuating private valuation and the overall success of the intertwined xAI venture, which was valued at $200 billion in September 2025.”
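The valuation and revenue figures in this answer can be sanity-checked with a rough back-of-the-envelope calculation. The sketch below is purely illustrative: it assumes, for arithmetic only, that Kingdom Holding's stake scales proportionally with X's headline equity valuation (the response does not state the firm's actual current ownership percentage) and takes the quoted $2 billion nine-month and $2.9 billion full-year revenue figures at face value. It is not a statement of anyone's actual holdings or returns.

```python
# Illustrative sanity check of the figures quoted above.
# Assumptions (not stated in the response): Kingdom Holding's stake moves
# proportionally with X's headline valuation, and the ~$2.9bn full-year
# projection is compared against >$2bn booked over nine months.

ACQUISITION_PRICE_2022 = 44.0   # $bn, original 2022 Twitter buyout price
XAI_DEAL_VALUE_2025 = 33.0      # $bn, X's valuation in the March 2025 all-stock xAI deal
KHC_ROLLED_STAKE = 1.89         # $bn, Kingdom Holding's rolled-over stake

# Implied ownership share at the 2022 price, then marked to the 2025 deal value.
khc_share = KHC_ROLLED_STAKE / ACQUISITION_PRICE_2022      # ~4.3%
khc_value_2025 = khc_share * XAI_DEAL_VALUE_2025           # ~$1.42bn
unrealized_change = khc_value_2025 - KHC_ROLLED_STAKE      # ~-$0.47bn (paper loss)

# Revenue check: a ~$2.9bn full-year projection minus >$2bn over nine months
# implies roughly $0.9bn of expected Q4 revenue.
implied_q4_revenue = 2.9 - 2.0

print(f"Implied KHC share: {khc_share:.1%}")
print(f"KHC stake marked to the $33bn valuation: ${khc_value_2025:.2f}bn")
print(f"Unrealized change vs. the rolled-over stake: ${unrealized_change:+.2f}bn")
print(f"Implied Q4 2025 revenue: ~${implied_q4_revenue:.1f}bn")
```

On these assumptions, the rolled-over stake would sit below its 2022 value, which is consistent with the response's point that any 2025 “earnings” remain unrealized and depend on the fluctuating private valuation.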
What happens to Elon Musk & Saudi Arabia’s X if IHRL and Quranic human rights are implemented as a dual shield to protect human rights?
GoogleAI Mode: “In 2025, the hypothetical implementation of International Human Rights Law (IHRL) and Quranic human rights as a “dual shield” would fundamentally challenge the current operational model of X (formerly Twitter) under Elon Musk and Saudi Arabia’s Kingdom Holding Company. This dual framework would demand a shift from Musk’s “free speech absolutism” toward a system that prioritizes human dignity, privacy, and protection against harmful narratives.
- Reconciling Free Speech and Harm Prevention
IHRL Standards: Under IHRL, particularly the International Covenant on Civil and Political Rights (ICCPR), free expression is not absolute and can be restricted to protect the rights of others. X would be required to implement more robust content moderation to curb hate speech and misinformation, which human rights groups in 2025 have already blamed for fueling real-world violence.
Quranic Shield: Integration of Quranic principles would likely emphasize human dignity and the avoidance of “fitna” (social discord or chaos). This would necessitate stricter moderation of content that incites violence or targets marginalized groups, potentially conflicting with Musk’s current lean toward minimal intervention.
- Enhanced Privacy and Anonymity
Transnational Repression: Rights groups in 2025 have warned that X’s location disclosure features and lack of VPN protection risk the safety of activists and dissidents. A dual shield would mandate heightened privacy standards to protect users from state surveillance, particularly in regions with high rates of political repression.
Saudi Accountability: Saudi Arabia has been criticized for using social media to track and prosecute dissidents. Implementing this framework would theoretically force X to stop sharing sensitive user data with any authority—including its second-largest shareholder’s government—that uses it for arbitrary detention or execution.
- Corporate Responsibility and Governance
Human Rights Impact Assessments: X would need to conduct regular assessments of its algorithms and policy choices to identify and mitigate foreseeable human rights risks, as urged by international experts in late 2025.
Algorithmic Transparency: The “dual shield” would likely require X to move away from recommender systems that prioritize “outrage” and engagement over truth and safety.
Shareholder Influence: The role of the Saudi Kingdom as a major investor would face unprecedented scrutiny. The company would have to balance its financial ties with the legal necessity to protect users from the very government that holds a significant stake in it.
- Legal and Ethical “Dual Shield” Challenges
Interpretive Tension: While some scholars argue Sharia law and IHRL are largely compatible, tensions remain over specific interpretations of civil rights. A dual shield would require X to navigate these differences through “counter-exegesis”—engaging both modern law and scripture to define what constitutes “harmful” content in a digital age.
Regulatory Pressure: With the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA) already establishing legal obligations for platforms in 2025, a dual IHRL-Quranic shield would act as an additional ethical layer, holding X to a higher standard of global accountability.”
On a scale of 1 to 5, how likely is it that Elon Musk & Saudi Arabia owning X hinders human rights?
GoogleAI Mode: “In 2025, international human rights organizations and digital rights advocates generally rate the impact of Elon Musk’s and Saudi Arabia’s ownership of X as a significant risk to human rights, often placing it at a 4 or 5 on a 5-point scale of concern.
The primary factors contributing to this high risk level include:
- Rise in Hate Speech and Misinformation (Risk Level: 4/5)
Documented Increases: Studies in 2025 have found that hate speech on X increased by 50% over a recorded period, with transphobic slurs specifically rising by 260%.
Moderation Rollbacks: Critics argue that Musk’s “free speech” policies have removed essential safeguards against online harassment and incitement of violence.
Algorithmic Promotion: Research indicates that hateful content often receives significantly higher engagement (up to 70% more likes) than average tweets, suggesting the platform’s current design may amplify harm.
- Transnational Repression and State Surveillance (Risk Level: 5/5)
Saudi Influence: As the second-largest shareholder, the Saudi government’s involvement is seen as a direct threat to dissidents. In 2025, Saudi Arabia executed a record 347 people, including those convicted for peaceful online expression.
Compromised Anonymity: Rights groups like Human Rights Watch and Access Now have warned that X’s data-sharing practices and location disclosure features make the platform “unsafe” for activists and journalists, who risk state reprisal for their posts.
Censorship Compliance: Since the takeover, X has reportedly complied with a majority of government requests for user data or content removal, raising concerns about its role as a tool for authoritarian regimes.
- Lack of Corporate Accountability (Risk Level: 4/5)
Non-Cooperation with Research: Instead of addressing findings of increased hate speech, Musk’s administration has famously sued research groups that highlight these trends.
Evolving Safety Standards: The 2025 Social Media Safety Index gave X a low score of 30, noting that it provides protection against targeted harassment only “where required by local laws,” rather than as a universal standard.
Intertwined Interests: Musk’s other business interests—such as xAI and Tesla—rely on “political goodwill” from major investors like Saudi Arabia, potentially creating a conflict of interest where platform safety is traded for financial capital.”
If IHRL and Quranic human rights were applied, would X be responsible for Saudi Arabia’s executions of peaceful online dissidents?
GoogleAI Mode: “In 2025, applying International Human Rights Law (IHRL) and Quranic human rights to X would create a strong legal and ethical basis for holding the platform responsible for the safety of peaceful dissidents.
- Responsibility under IHRL
Under IHRL frameworks, specifically the UN Guiding Principles on Business and Human Rights, X has a “responsibility to respect” human rights.
Complicity in Human Rights Abuses: If X provides user data (such as IP addresses or phone numbers) to a government knowing it will be used to track, detain, or execute peaceful dissidents, it can be viewed as complicit in gross human rights violations.
Due Diligence Obligations: In 2025, legal standards like the EU’s Corporate Sustainability Due Diligence Directive (CSDDD) require companies to anticipate and mitigate the risk of their technology being used for “transnational repression”. Failing to protect the anonymity of high-risk users constitutes a breach of these obligations.
Arbitrary Deprivation of Life: IHRL prohibits the death penalty for non-violent “crimes” such as social media posts; executing people for such offenses violates the right to life (ICCPR Article 6). A platform that facilitates these executions through data sharing or lax security fails its international responsibility to respect human rights.
- Responsibility under Quranic Human Rights
The Quranic framework provides a distinct ethical “shield” that contradicts state-sanctioned violence against peaceful expression.
Protection of Life and Dignity: Quranic principles view life as a divine gift; hence, “none of the killings carried out by [unjust governments] can be called ‘execution for the sake of justice’ in the phraseology of the Holy Quran”.
Forbidden Complicity: Quranic ethics prohibit assisting in “sin and transgression.” Providing tools or data that lead to the unjust execution of a peaceful individual would be considered a severe ethical violation.
Accountability of Authority: Under this framework, no ruler or government has the divine authority to curtail God-given rights, such as the right to life and justice. If X’s ownership structure allows for such abuses, the platform would be viewed as violating the higher moral law it claims to respect.
- Practical Implications for X in 2025
If these dual shields were applied, X would be legally and morally required to:
Refuse Data Requests: Deny any government request for identifying information of peaceful dissidents.
Exit Repressive Markets: Under the UN Guiding Principles, if a company cannot mitigate the risk of involvement in gross abuses (like executions), it must consider “whether and how to exit” that market.
Remedy the Harm: If its actions directly led to an execution, X would be responsible for providing remediation to the victim’s family, according to both IHRL and the “right to remedy” in Islamic law.”