China’s Influence Operations

There is substantial disagreement about what qualifies as an influence operation, but for the purposes of this article, we define influence operations as covert or dishonest attempts to sway the opinions of a target audience. Note that our definition is unconcerned with the message’s veracity (whether the information conveyed is accurate) or with the source of the information.

Operations that aim to persuade an audience of a particular viewpoint, energize those who hold specific views, and/or divert target audiences are all examples of influence operations.

The logic of distraction holds that propagandists are vying for users’ already scarce attention on social media platforms. If they can divert target audiences from an unfavorable narrative developing on social media by disseminating alternative theories or muddying the information environment, propagandists can successfully capture user attention without necessarily persuading anyone.

Although influence operations can take many different shapes and employ a variety of techniques, a few common threads connect them. According to a recent study of political influence operations in the Middle East, these operations frequently used one of the following strategies:

• Promoting one’s own government, culture, or policies;

• Arguing for or against particular policies;

• Attempting to destabilize foreign relations or domestic situations in rival nations;

• Attempting to make allies look good and opponents look bad to third-party countries.

In a number of these instances, the accounts carrying out the operation pretended to be locals voicing displeasure with their government or with particular political figures. Social media manipulation campaigns frequently use such digital agents of influence to conceal the true source of the content from the target audience. For instance, Russia’s Internet Research Agency (IRA) accounts sent direct messages to members of Black American and conservative American communities while posing as activists. Subtle linguistic indicators can help expose these false personas: a misused idiom, a recurring grammatical fault, or even a backtick (`) where a native speaker would use an apostrophe (’). Hostile state-level actors deploy a variety of staffing strategies, drawing on their own workforce or contracting the work out to virtual mercenaries.
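To make the backtick tell concrete, here is a minimal sketch in Python of how such a check might be run over a batch of posts. The regular expression and sample posts are hypothetical illustrations, not part of any real detection pipeline, which would combine many weak signals like this one.

```python
import re

# A minimal sketch, assuming posts arrive as plain strings; the sample posts
# below are hypothetical. It flags a backtick (`) sandwiched between letters
# (e.g. "don`t"), one of the subtle linguistic tells described above. Real
# detection pipelines combine many weak signals; a single tell proves nothing.
BACKTICK_AS_APOSTROPHE = re.compile(r"[A-Za-z]`[A-Za-z]")

def flag_suspicious_posts(posts):
    """Return (index, post) pairs where a backtick stands in for an apostrophe."""
    return [(i, p) for i, p in enumerate(posts) if BACKTICK_AS_APOSTROPHE.search(p)]

if __name__ == "__main__":
    sample = [
        "I can't believe the council approved this.",        # ordinary apostrophe
        "We don`t need outsiders telling us what`s right!",  # backtick tell
    ]
    for i, post in flag_suspicious_posts(sample):
        print(f"post {i} shows a backtick-for-apostrophe tell: {post!r}")
```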

Since 2016, Twitter and Meta have taken down well over 100 social media influence operations originating in dozens of nations. These operations frequently rely on inauthentic amplification, fake news websites, and persona creation (fabricating false identities to disseminate a message).

However, influence operations have also spread well beyond Facebook and Twitter to other platforms, more intimate settings, and encrypted spaces. Reporting in collaboration with Israeli disinformation researchers, The New York Times described how “Iranian agents had infiltrated small [Israeli] WhatsApp groups, Telegram channels and messaging apps” to promote divisive messages. These operations occasionally show inventiveness by turning platform rules into offensive weapons.

For instance, a pro-Tanzanian-government campaign that Twitter removed in 2021 targeted Tanzanian activists’ accounts by filing false copyright complaints against them.

[Figure: Matrix of Chinese Influence Operations]

Recent studies and media coverage of influence operations have tended to concentrate on foreign campaigns, in which governments or individuals target people in other countries. But influence efforts can also be domestically focused, as the Tanzanian case demonstrates. Political actors frequently disseminate clandestine propaganda aimed at their own constituents to boost their popularity, undercut an adversary, or sow confusion within the political system. In 2020, Facebook terminated phony accounts associated with Brazilian politicians, including President Jair Bolsonaro and his sons, Congressman Eduardo Bolsonaro and Senator Flávio Bolsonaro, for disseminating divisive content about Brazilian politics. Indeed, many critics consider domestic influence operations, rather than foreign ones, the most concerning.

Additionally, influence operations have been used to take sides in intraparty disputes and, in the case of some campaigns linked to the Chinese Communist Party, to target expatriate communities.

Techniques Employed

China employs a variety of techniques for influence operations to shape opinions, advance its interests, and project power. These techniques can be both overt and covert, and they are often coordinated by the Chinese Communist Party (CCP) and its various government agencies. Here are several techniques that China has been known to use:

Propaganda and Media Influence: China maintains a vast propaganda apparatus that includes state-controlled media outlets such as Xinhua News Agency, CCTV, and People’s Daily. These outlets disseminate narratives aligned with the CCP’s objectives and promote positive portrayals of China. They also engage in information warfare by spreading disinformation, censoring content, and manipulating online discussions on social media platforms.

Economic Leverage: China utilizes its economic power to exert influence. It may offer financial aid, investment, or loans to other countries, attaching conditions that align with its political objectives. This can create dependencies and influence decision-making processes in target nations.

Diplomatic Influence: China engages in diplomatic efforts to shape global narratives, build alliances, and influence international organizations. It actively seeks to cultivate relationships with political leaders, academic institutions, and think tanks globally. It hosts conferences, forums, and summits to promote its interests and propagate its narratives.

United Front Work: The CCP employs a strategy known as “United Front Work” to co-opt and influence individuals and groups outside of its formal political structure. This includes overseas Chinese communities, ethnic or religious groups, business leaders, and intellectuals. The goal is to create a favorable perception of China and gain support for its policies.

Cyber Espionage and Information Warfare: China has been accused of engaging in state-sponsored cyberattacks and espionage to steal intellectual property, gain access to sensitive information, and manipulate public opinion. It also employs “50 Cent Army” or “Wu Mao” internet trolls who spread pro-China propaganda and disrupt discussions critical of the regime.

Confucius Institutes: China has established Confucius Institutes in various countries, which are cultural and educational centers that promote Chinese language and culture. However, they have been criticized for engaging in censorship, promoting self-censorship, and advancing Chinese Communist Party propaganda.

Overseas Investments and Infrastructure Projects: China’s Belt and Road Initiative (BRI) is a large-scale infrastructure development program that seeks to enhance connectivity across Asia, Europe, Africa, and beyond. By investing in critical infrastructure projects in other countries, China can deepen its economic ties, gain influence, and extend its geopolitical reach.

United Front Work

United Front Work is a strategy employed by the Chinese Communist Party (CCP) to co-opt and influence individuals and groups outside of its formal political structure. Here are a few examples of United Front Work activities carried out by China:

Overseas Chinese Communities: China seeks to influence overseas Chinese communities by establishing various organizations and associations that promote Chinese culture and interests. These organizations often operate under the umbrella of the CCP’s United Front Work Department. They aim to maintain ties with overseas Chinese, encourage support for China’s policies, and discourage dissent or criticism of the CCP.

Ethnic and Religious Groups: China targets ethnic or religious groups within its borders and abroad through United Front Work. For example, it seeks to co-opt Tibetan and Uighur diaspora communities by promoting narratives that downplay human rights concerns and highlight economic development and cultural preservation efforts in Tibet and Xinjiang. China also uses United Front tactics to exert influence over overseas Chinese Muslim organizations.

Business and Intellectual Elites: China actively cultivates relationships with influential business leaders, academics, and intellectuals around the world. Through various means, such as funding research institutions, supporting academic exchanges, and organizing conferences, China seeks to shape narratives, gain access to cutting-edge technologies, and project soft power. These efforts aim to build networks of individuals who are sympathetic to China’s policies and willing to advocate for them.

Political Parties and Politicians: China seeks to cultivate relationships with politicians and political parties in other countries. It may offer financial support, establish friendship associations, or facilitate exchange programs to build political influence, extending financial assistance, investment opportunities, or other incentives to countries or politicians that align with its interests.

Media and Academic Institutions: China invests in media outlets and academic institutions worldwide to enhance its soft power and control narratives. This includes establishing partnerships with foreign media organizations, sponsoring Confucius Institutes in universities (which have been accused of promoting CCP propaganda), and funding research centers or think tanks that produce China-friendly analysis.

The Impact of Influence Operations

Influence operations may have an effect through their particular content or focus (for example, via persuasion), or by undermining public confidence in the information environment as a whole.

Resources, message quality, and operation detectability often constrain the direct impact of content in modern influence operations. Depending on the operator’s objectives, these factors may matter more or less. For instance, if operators are merely trying to distract targets rather than persuade them of a particular opinion, the effectiveness of each individual message matters far less. In theory, language models might one day partially circumvent these limitations.

To affect trust in an information environment as a whole, what matters more is creating the impression that any given message may be false or deceptive. Even if influence operations fail to alter people’s opinions, they may cause people to doubt the veracity of information they receive even from reliable sources, which could undermine trust in democratic and epistemic institutions more generally.

Content-Based Impact

An influence operation may have a content-dependent impact if it:

• Persuades someone of a specific position or reinforces an existing one,

• Distracts them from discovering or developing other ideas, or

• Crowds out room for higher-quality thought.

Often, the intention is merely to divert attention away from information that could be harmful to the operator. Distraction operations frequently exploit and worsen preexisting competition for attention among advertisers, media sources, and platforms, crowding out crucial information with attention-grabbing, irrelevant material. A distraction operation therefore does not require that targets be convinced by the information spread, only that they not be convinced by (or even consider) any other piece of information.

The impact of an influence operation can sometimes be quantified or traced, in both historical and modern cases. For instance, during the HIV epidemic of the 1980s, the Soviet Union ran an influence campaign to spread the myth that the United States government had created the virus in a lab. According to a 2005 study, 27% of African Americans still held this belief. In 2016, the IRA used fake agents of influence on Facebook to organize both a rally and a counterprotest outside the Islamic Da’wah Center in Houston. Because the protests would not have taken place without the IRA’s involvement, the impact in this case is straightforward to identify.

While platforms attribute influence operations and share information with researchers about those they shut down, researchers still have little visibility into the impact on users or on users’ post-engagement behavior. Moreover, not every influence operation is discovered. Because of multicausality and the difficulty of tracking changes in public opinion over time, even propagandists who try to gauge their own influence sometimes run into problems. Scholars have noted that, historically, this uncertainty has helped intelligence agencies exaggerate the impact of their influence operations for administrative gain.

Despite these measurement difficulties, several factors, namely resources, content quality and messaging, and detectability, clearly constrain the impact of current operations. After briefly describing these constraints, we discuss how generative models might help propagandists overcome them.

• Resources: As with marketing efforts, the effectiveness of an influence operation depends on the ability to reach the target audience with the required material. How many propagandists can a political actor employ to write content? How many social media accounts can they open to create a false online presence? Low-resource initiatives have a lower chance of reaching their target audience or attracting media attention.

• Content Quality and Message: People are less likely to be persuaded by messaging that sharply contradicts their preexisting viewpoint or by arguments that are shoddily constructed or illogically reasoned. All else being equal, campaigns whose messaging contradicts targets’ beliefs, fails to blend into the target’s information environment, or rests on weak arguments are less likely to succeed.

• Detectability: Lastly, operations that are quickly discovered are less likely to have an effect. Independent researchers and social media platforms actively hunt for influence operations, which the platforms then take down to limit their reach. Indeed, propagandists’ awareness that their operations may be taken down can shape their behavior, encouraging them to pursue diversion strategies rather than persona creation, which takes more time and effort but can be more persuasive to observers.

It is worth keeping these constraints in mind as we consider the potential contribution of language models to influence campaigns: if language models can overcome these barriers, they could pose a serious problem for the information environment. We discuss this in more detail in Section 4.

Trust-Based Downstream Impact

The second way influence operations can have an effect is by undermining trust. Even when influence campaigns are detected, their presence, especially at scale, may lead people to become skeptical of other, authentic sources. Degrading public trust does not necessarily require high-quality campaigns. Particularly where information technologies make it difficult to assess the reliability of sources, propagandists frequently seek to exploit the mental shortcuts their targets use to decide whom to trust. By faking or misusing credentials and testimonials, and by tampering with photographic and video evidence, influence operators can erode credibility well beyond the narrow focus of their campaign.

Lowering societal trust can make it harder for a society to organize rapid responses to emergencies, which may be a desirable objective for hostile actors in and of itself. Propagandists can also advance their goals more easily in an environment of reduced societal trust. Existing societal division and polarization make it harder for trustworthy actors to gain widespread respect, and they give influence operators a foothold to target their messaging at specific audiences, sow discord, and undermine institutional and social confidence. Low generalized trust threatens the norms that allow people and organizations to connect and collaborate without numerous rules and processes to govern their behavior.