Foreign Interference and Foreign Influence Operations

Foreign Interference or Foreign Influence (FI) covers malign actions taken by foreign governments or their proxies, designed to sow discord, manipulate public discourse, discredit the electoral system, bias the development of policy, or disrupt markets, all with the purpose of undermining the interests of a country and its allies.

Information Activities (IA), a particular form of FI, are all activities undertaken to shape public opinion or undermine trust in the authenticity of information. They include the use of new and traditional media to amplify divides and foment unrest in a country, sometimes coordinated with illicit cyber activities.

Malign foreign influence efforts come from state actors, as well as individuals, groups, and organizations motivated by criminal intent, ideological objectives, or both.

Democratic countries are especially vulnerable to foreign influence due to a number of factors, including the protection of freedom of speech and expression, heavy reliance on social media platforms, and declining trust in traditional news media.

Online platforms facilitate the high-speed, large-scale, and targeted spread of conspiracy theories and disinformation. Users are automatically fed information that confirms their existing cognitive biases, which leads to polarized, emotional debates.

Given the above, when foreign influence and interference manifests itself through the active spreading of disinformation, the consequences can be dire. Disinformation is a revolutionary tool of warfare: it requires no direct acts of violence, yet has significant potential to disrupt society, business, and politics. Disinformation efforts can be relatively inexpensive to mount and difficult to counter efficiently without significant coordination and cooperation.

Although legislation can be an important component of countering foreign influence, such as the US Countering Foreign Propaganda and Disinformation Act of 2016, no legislation can keep pace with technological advances that increase the speed and sophistication with which foreign influence campaigns can be devised and executed.

Foreign influence and disinformation should be seen as a continuous, ongoing assault on democratic countries, rather than a series of discrete, targeted, event-specific campaigns.

False Information Operations include the spreading of false narratives through traditional and social media outlets to manipulate and mislead the population, and the weaponization of information to undermine organizations and democratic processes or to deepen societal divisions.

With Deep Fake Technologies (DFTs) and Artificial Intelligence (AI), it is possible to put words into someone’s mouth and to alter images and videos convincingly.

With technologies like Deep Video Portraits (DVPs), which have been used in Hollywood movies and even YouTube videos, manipulators can make a target blink, open their mouth, raise their eyebrows, and turn their head from side to side.

Foreign influence operators back extreme political groups with financial and logistical support. They approach political and advocacy groups that either promote an agenda friendly to the foreign government supplying the support or espouse extremist and divisive views. In many cases, these groups may not even know the true source of the donations or other support, as they are skillfully led to a belief that makes acceptance easier.

In most countries, there is no single or joint government body in charge of coordinating efforts to counter foreign influence campaigns, such as countering disinformation or investigating support for extreme political groups. When issues are identified, they are often handled like fire-fighting: reactively, dealing with each incident rather than proactively introducing greater safeguards and preventative measures.

Often the sources of sustained disinformation campaigns targeting the public are not addressed due to a lack of coordination: no one seems to know who is responsible for countering a particular threat. For the same reason, it is difficult to assess the true scale of the issue, because no one has taken direct ownership of the problem.

According to the report of the US Cyber Digital Task Force, malign foreign influence operations include:

- Cyber operations targeting election infrastructure, such as voter registration databases, voting machines, or other critical infrastructure;

- Cyber operations targeting political organizations, campaigns, and public officials;

- Covert influence operations to assist or harm political organizations, campaigns, and public officials;

- Covert influence operations, including disinformation operations, to influence public opinion and sow division, such as the operation of social media pages and other forums that spread disinformation and divisive messaging to U.S. audiences; and

- Overt influence efforts, such as the use of lobbyists, foreign media outlets, and other organizations to influence policymakers and the public.

Understanding Deep Video Portraits (DVPs)

Deep Video Portraits (DVPs) represent a sophisticated advancement in the field of digital media manipulation and artificial intelligence. They utilize machine learning techniques, particularly deep learning, to generate or alter video footage in such a way that the facial expressions, head poses, eye movements, and even lip movements of a target person in a video can be altered convincingly. These capabilities make DVPs a powerful tool in visual media, with potential applications ranging from entertainment and film to virtual reality and teleconferencing.

Step 1: Data Collection and Training. The first step involves gathering extensive video footage of the target individual. This footage is used to train a neural network, which learns the nuances of the person's facial expressions, how their lips move when they speak, and how their eyes and head move in various situations.
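To make this step concrete, here is a minimal, hypothetical sketch of how such a training set might be organized and split; the names, fields, and structure are assumptions for illustration, not taken from any specific DVP implementation (in a real pipeline the landmarks and head pose would come from a face tracker run over hours of footage).

```python
import random
from dataclasses import dataclass

@dataclass
class TrainingSample:
    """One video frame of the target person plus extracted face data."""
    frame_id: int
    landmarks: list        # 2-D facial landmark coordinates (placeholder format)
    head_pose: tuple       # (yaw, pitch, roll) in degrees (placeholder values)

def build_dataset(num_frames: int, val_fraction: float = 0.1, seed: int = 0):
    """Split the target's footage into training and validation sets."""
    rng = random.Random(seed)
    samples = [
        TrainingSample(frame_id=i, landmarks=[], head_pose=(0.0, 0.0, 0.0))
        for i in range(num_frames)
    ]
    rng.shuffle(samples)          # avoid splitting along scene boundaries
    n_val = int(num_frames * val_fraction)
    return samples[n_val:], samples[:n_val]

train, val = build_dataset(1000)
print(len(train), len(val))  # 900 100
```

The held-out validation frames let the trainers check that the network generalizes to expressions it has not memorized, rather than overfitting to specific frames.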

Step 2: Modeling and Animation. Using deep learning algorithms, particularly those based on Generative Adversarial Networks (GANs) or Variational Autoencoders (VAEs), a model is created that can generate or alter facial expressions and movements. These models capture the high-level attributes of facial dynamics, allowing for the realistic re-animation of faces in videos.
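To make the "adversarial" idea behind GANs concrete, the following toy sketch shows the two competing loss terms: the discriminator is trained to score real footage high and generated footage low, while the generator is trained to fool it. This is a deliberate simplification over scalar scores; real DVP models operate on images.

```python
import math

def sigmoid(x):
    """Squash a raw score into a (0, 1) probability of being real."""
    return 1.0 / (1.0 + math.exp(-x))

def discriminator_loss(d_real, d_fake):
    """Binary cross-entropy: reward D for scoring real high, fake low."""
    return -math.log(d_real) - math.log(1.0 - d_fake)

def generator_loss(d_fake):
    """Non-saturating generator loss: reward G when D scores fakes high."""
    return -math.log(d_fake)

# At equilibrium the discriminator cannot tell real from fake and
# outputs 0.5 everywhere; the losses settle at log 4 and log 2.
print(round(discriminator_loss(0.5, 0.5), 3))  # 1.386
print(round(generator_loss(0.5), 3))           # 0.693
```

Training alternates gradient steps on these two objectives until the generated faces become statistically indistinguishable from the real footage.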

Step 3: Re-Targeting. This process involves mapping the recorded movements and expressions of one person onto the video portrait of another person. Essentially, the facial expressions and head movements of a source actor can be transferred to the target video portrait, making it appear as if the target person is naturally performing these actions.
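The re-targeting idea can be sketched as transferring per-landmark expression offsets from the source actor onto the target's neutral face. This is a deliberately simplified, hypothetical formulation; production systems transfer expressions through learned parametric face models rather than raw pixel-space offsets.

```python
def retarget(source_neutral, source_frame, target_neutral):
    """Transfer the source actor's expression onto the target face.

    Each argument is a list of (x, y) facial landmarks. The expression is
    modeled as the per-landmark offset of the source frame from the source
    actor's neutral pose, applied to the target's neutral pose.
    """
    offsets = [(fx - nx, fy - ny)
               for (nx, ny), (fx, fy) in zip(source_neutral, source_frame)]
    return [(tx + dx, ty + dy)
            for (tx, ty), (dx, dy) in zip(target_neutral, offsets)]

# Source actor opens their mouth: the lower-lip landmark drops by 5 px.
source_neutral = [(100, 200), (100, 220)]   # upper lip, lower lip
source_frame   = [(100, 200), (100, 225)]
target_neutral = [(300, 400), (300, 418)]

print(retarget(source_neutral, source_frame, target_neutral))
# [(300, 400), (300, 423)]
```

Applied frame by frame, this makes the target appear to perform the source actor's movements.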

Step 4: Rendering and Refinement. The output is then fine-tuned to ensure that the lighting, texture, and shadows on the face match the original video footage as closely as possible, maintaining realism and preventing visual discrepancies that might reveal the manipulation.
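One common flavor of this refinement is matching the synthesized face's brightness statistics to the surrounding footage. The sketch below uses simple mean and standard-deviation transfer on grayscale intensities; this is an assumption about the kind of correction used, not a specific DVP algorithm.

```python
import statistics

def match_statistics(rendered, reference):
    """Rescale rendered pixel intensities so their mean and spread match
    the reference footage, reducing visible seams between synthesized
    and original regions."""
    r_mean, r_std = statistics.mean(rendered), statistics.pstdev(rendered)
    ref_mean, ref_std = statistics.mean(reference), statistics.pstdev(reference)
    scale = ref_std / r_std if r_std else 1.0
    return [(p - r_mean) * scale + ref_mean for p in rendered]

# The rendered face is too bright and too flat compared with the scene.
rendered  = [200, 210, 220]   # mean 210, low contrast
reference = [90, 110, 130]    # mean 110, twice the spread
corrected = match_statistics(rendered, reference)
print([round(p) for p in corrected])  # [90, 110, 130]
```

After this correction, the synthesized region blends into the original frame, removing the brightness mismatch that would otherwise betray the manipulation.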

The power of DVP technology introduces significant legal and ethical challenges. A particularly concerning misuse is the creation of deepfakes, where DVPs are used to produce misleading or harmful content, such as fake news, political misinformation, or non-consensual adult content.

Educating all employees in companies and organizations about the existence and capabilities of deepfake technology will foster a more discerning consumption of digital media. Balancing innovation with responsibility is key to harnessing the benefits of DVPs while minimizing their potential harms.