Turmoil at Big Tech’s Anti-Terrorism Group

Inside Two Years of Conflict and Controversy at GIFCT

TikTok’s Membership Bid Rejected

In March 2023, top executives from Meta, YouTube, Twitter, and Microsoft met virtually to discuss TikTok’s application to join the Global Internet Forum to Counter Terrorism (GIFCT). TikTok had satisfied the required training and addressed concerns about its ties to China, but worries lingered that the company might abuse its membership. Those anxieties were heightened by ongoing US discussions about a potential ban on the app and by TikTok’s past content moderation problems. Ultimately, TikTok’s application failed because two board members abstained from the vote. Just a week later, researchers criticized TikTok for hosting content celebrating the Christchurch terrorist attack, content that could easily have been flagged and removed had TikTok been granted access to GIFCT’s threat-detection technology.

PornHub and X’s Departure

The board’s stringent vetting was also evident in its decision to deny membership to PornHub’s parent company, citing concerns about the company’s content policies. Conversely, the board quickly approved the French social app Yubo, which, after receiving tailored advice from GIFCT, reported 50 suspicious accounts to law enforcement.

More recently, even as Twitter (now X) relaxed its content moderation under Elon Musk, threatening GIFCT’s reputation and that of its member companies, Meta, Microsoft, and YouTube opted not to expel X from the board. This month, however, X quietly resigned its seat of its own accord.

Secretive Decision-Making and Funding Concerns

These membership decisions, previously undisclosed, reveal how Microsoft, Meta, YouTube, and X have controlled access to anti-terrorism guidance, influencing content across the internet. Our investigation further uncovers contentious fundraising practices and the consequences of inadequate quality control in GIFCT’s content flagging system.

Internal Tensions and Funding Dilemmas

Since its founding in 2016, GIFCT has been governed by the four tech giants. The consortium aims to address online harms, including child abuse and the illicit trade of intimate images, and has contributed to reducing unwanted content. Yet, the political dynamics within GIFCT have remained largely hidden.

GIFCT relies on voluntary contributions from its members to fund its operations. While Microsoft, Google, and Meta each donated at least $4 million between 2020 and 2022, and Twitter contributed $600,000, many other members paid nothing. This disparity angered some board members, leading to worries about the future of staff positions.

In response to these concerns, the board has explored alternative funding sources, including government grants and private foundations. Some critics questioned the ethics of courting such money, and records show that staff considered a grant from the pro-Israel philanthropy Newton and Rochelle Becker Charitable Trust. GIFCT’s executive director, Chowdhury Fink, confirmed that the organization ultimately did not apply for the grant.

To ensure consistent funding, GIFCT’s bylaws have been amended to require minimum annual contributions from all members starting in 2025, though exceptions may be possible. Paying members will gain voting rights for two board seats, with eligibility contingent on a larger donation. X’s refusal to pay these contributions led to its resignation from the board, relinquishing its tie-breaking power. Notably, Meta, YouTube, and Microsoft could have removed Twitter from the board after Musk’s acquisition, but chose not to exercise this power.

Calls for X’s Removal and Concerns about Hash Database

Many individuals connected to GIFCT argue that Meta, YouTube, and Microsoft should permanently remove X from the consortium due to its alleged tolerance of extremist activity, including the reported sale of weapons. The Times recently reported on calls for X’s expulsion.

GIFCT’s “tech solutions” code of conduct allows members to be banned for “sustained inappropriate behavior.” X says it suspended more than 57,000 accounts for violating its violent and hateful entities policy in the first half of 2024; it did not respond to requests for comment.

One of GIFCT’s most visible functions is its crisis response mechanism. The group generates hashes, or digital fingerprints, of harmful content and uploads them to a database hosted on Meta servers. Member companies then compare content on their own services against this database to identify and remove matching posts. After the Buffalo shooting in 2022, GIFCT alerted members and uploaded hashes of the shooter’s videos and manifesto.
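To illustrate the mechanism in broad strokes, here is a minimal sketch of how a member service might check uploads against a shared hash set. It is a simplified, hypothetical example: the function and variable names are invented, and the real pipeline relies on perceptual hashes designed to catch re-encoded or lightly edited copies, not the exact-match cryptographic digest used below.

```python
# Minimal sketch of hash-based content matching, for illustration only.
# Real deployments use perceptual hashes that tolerate re-encoding and minor
# edits; the exact-match SHA-256 digest below is a stand-in, and all names
# here are hypothetical.
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a hex digest standing in for a shared-database hash."""
    return hashlib.sha256(content).hexdigest()

# Hashes a member company has pulled from the shared database.
shared_database = {
    fingerprint(b"example attack video bytes"),
    fingerprint(b"example manifesto bytes"),
}

def should_flag(upload: bytes) -> bool:
    """Flag an upload if its fingerprint appears in the shared database."""
    return fingerprint(upload) in shared_database

# A member service would run new uploads through should_flag() and route any
# matches to human review or automated removal, depending on its own policies.
print(should_flag(b"example manifesto bytes"))  # True: queue for action
print(should_flag(b"unrelated holiday photo"))  # False: no action
```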

While companies contend that hash sharing streamlines content moderation and prevents the spread of harmful content, GIFCT has faced criticism for its lack of transparency and safeguards. There is no public disclosure of the amount of content removed based on hash matches or the number of user appeals. Only YouTube has reported its contributions to the database, adding approximately 45,000 hashes last year. GIFCT has declined to disclose the number of hashes added by its staff or through tips from researchers and governments, and to what extent external hashes are verified.

Even the list of companies accessing the database is not publicly available, with only 13 out of 25 members having access as of last year. GIFCT does not know how many members manually review content before sharing hashes or taking action on matches. While the consortium maintains that innocent content rarely ends up in the database, it has not allowed external auditing or comprehensive internal reviews despite documented errors.

In 2022, one social media company found that two hashes from the database matched thousands of copies of the music video for “Never Gonna Give You Up.” The incident, previously disclosed by GIFCT without naming the song, highlighted the system’s potential for error. A small audit conducted in 2022 flagged invalid submissions, though the exact number was not revealed; following that review, the database shrank from 2.3 million hashes to 2.1 million in October 2022.

Calls for Transparency and Reform

This year, Australia’s eSafety Commissioner demanded information about the counterterrorism efforts of Google, Meta, and X. The results of this investigation are expected to be published soon.

Critics of GIFCT’s governance argue that the consortium’s founding members already possess the necessary tools and strategies to protect free speech and curb violence-inciting hate. A human rights impact assessment commissioned by GIFCT in 2021 recommended 47 changes, many of which remain unimplemented. The assessment advised approving the membership of high-risk, non-US companies with appropriate safeguards to minimize human rights harms. It also recommended requiring human review for all hashes contributed by members and replacing the board’s current composition with a more diverse and inclusive model, including activists and academics.

While GIFCT established an independent advisory committee in 2020, its recommendations have not always been implemented. Some members of the advisory panel, composed of academics, activists, and government officials, have felt ignored, with a few “quiet quitting” this year. The panel has expressed concerns about GIFCT’s focus on Euro-North American issues and its inadequate attention to issues like white supremacy and far-right violence. It has also called for greater emphasis on violence in Africa and Asia, and a shift away from an overemphasis on combating Islamist extremism.

Despite the criticism, GIFCT’s executive director acknowledges the value of the advisory panel’s feedback. However, internal records indicate that discussions between the panel and the board have been tense.

Controversial Involvement in the Israel-Hamas Conflict

GIFCT has faced controversy for facilitating the removal of content related to the Israel-Hamas war. The group has generally avoided content takedowns during conflicts, but its involvement in the Gaza crisis has been seen by some staff as siding with Israel.

Alternative Approaches and Future Implications

While critics of GIFCT’s management style exist, most acknowledge the importance of some form of coordination among tech companies to address online extremism. Without it, individual companies would struggle to combat interconnected threats, and governments might impose stricter censorship.

Companies rejected by GIFCT, including TikTok and PornHub, have found support from Tech Against Terrorism, an initiative funded by governments and tech companies. This organization provides automatic alerts about extremist content and will soon launch a certification program and its own database for sharing hashes. GIFCT previously paid Tech Against Terrorism to evaluate and train potential members, but a growing rivalry between the two organizations led to the termination of this contract. Starting next year, GIFCT staff will handle training, consolidating control over this process.

The future of GIFCT remains uncertain. While the consortium has contributed to reducing online extremism, its lack of transparency, internal conflicts, and questionable practices raise serious concerns. As the landscape of online threats continues to evolve, the need for a more transparent, effective, and inclusive approach to counterterrorism is critical. It remains to be seen whether GIFCT will rise to this challenge or become a relic of a flawed approach to online safety.

News source: https://www.wired.com/story/gifct-x-meta-youtube-microsoft-anti-terrorism-big-tech-turmoil/
