Deception, disinformation, misinformation, propaganda, and direct democracy.
"For where the lion’s skin will not reach, you must patch it out with the fox’s."
– Lysander the Spartan
Misinformation is incorrect or misleading information.
Disinformation is false information, deliberately and often covertly spread in order to influence public opinion or obscure the truth.
Propaganda is a broader and older term, and it uses disinformation as one of its methods. While the French philosopher Jacques Driencourt asserted that everything is propaganda, the term is most often associated with political persuasion and psychological warfare.
Psychological warfare is the use of propaganda against an enemy (or even a friend that could become an enemy in the future), with the intent to break his will to fight or resist, or to render him favourably disposed to one's position.
In deception (according to Bell and Whaley), someone is showing the false and hiding the real. Hiding the real is divided into masking, repackaging, and dazzling, while showing the false is divided into mimicking, inventing, and decoying.
Citizens are remarkably bad at detecting deception and disinformation.
They often trust what others say, and usually they are right to do so. This is called the "truth bias". People also tend to believe something when it is repeated, and they tend to believe the version of events they learn first; subsequent rebuttals may reinforce the original information rather than dispel it.
Humans have an unconscious preference for things they associate with themselves, and they are more likely to believe messages from users they perceive as similar to themselves. They believe that sources are credible if other people consider them credible. They trust fake user profiles with images and background information they like.
Citizens must understand that millions of fake accounts follow thousands of real users, creating the perception of a large following. This large following enhances perceived credibility, and attracts more human followers, creating a positive feedback cycle.
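The feedback cycle described above can be sketched as a toy simulation. This is an illustration only: the function name, the starting numbers, and the assumption that new real followers join in proportion to an account's total visible following are all hypothetical, not empirical findings.

```python
# Toy model (illustrative only): fake followers inflate an account's
# visible following, which in turn attracts real followers.

def simulate_follower_growth(fake_followers, real_followers, rounds,
                             attraction_rate=0.001):
    """Each round, new real followers join in proportion to the
    account's total visible following (real + fake).
    The attraction_rate is an arbitrary assumption."""
    history = [real_followers]
    for _ in range(rounds):
        visible = fake_followers + real_followers
        real_followers += int(visible * attraction_rate)
        history.append(real_followers)
    return history

# An account seeded with 100,000 fake followers versus one without.
seeded = simulate_follower_growth(fake_followers=100_000, real_followers=100, rounds=10)
organic = simulate_follower_growth(fake_followers=0, real_followers=100, rounds=10)
print(seeded[-1], organic[-1])
```

Under these assumed numbers, the seeded account gains real followers every round while the purely organic account stays flat, which is the "positive feedback cycle" in miniature.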
People are more likely to believe others who are in positions of power. Fake accounts have false credentials, like false affiliation with government agencies, corporations, activists, and political parties, to boost credibility.
Freedom of information and expression are of paramount importance in many cultures. The more freedom of information we have, the better. But the more information we have, the more difficult it becomes to tell right from wrong. The right to expression and the freedom of information can be turned against the citizens: information is often weaponized.
The Internet and social media are key game-changers in exploiting rights and freedoms. In the past, a secret service had to work hard to plant disinformation in the press. Today, the Internet and social media make it possible to spread limitless fake photos, reports, and "opinions". Many secret services wage online wars using Twitter, Facebook, LinkedIn, Google+, Instagram, Pinterest, Viber, etc. Imagination is the only limit.
Social media platforms, autonomous agents, and big data are directed towards the manipulation of public opinion. Social media bots (computer programs that mimic human behaviour and conversations using artificial intelligence) allow for the massive amplification of political views; they manufacture trends, game hashtags, flood channels with content, spam the opposition, and attack journalists and others who tell the truth.
In the hands of State-sponsored groups these automated tools can be used to both boost and silence communication and organization among citizens.
Over 10 percent of content across social media websites, and 62 percent of all web traffic, are generated by bots, not humans. Over 45 million Twitter accounts are bots, according to researchers at the University of Southern California.
Machine-driven communications tools (MADCOMs) use cognitive psychology and AI-based persuasive techniques. These tools spread information, messages, and ideas online for influence, propaganda, counter-messaging, disinformation, espionage, and intimidation. They use human-like speech to dominate the information space and capture the attention of citizens.
Artificial intelligence (AI) technologies enable computers to simulate cognitive processes, such as elements of human thinking. Machines can make decisions, perceive data or the environment, and act to satisfy objectives.
The objective of this web site: The rule of the people, by the people, and for the people, requires citizens that can make decisions in areas they do not always understand. We support the Federal Council's national strategy for the protection of Switzerland against cyber risks and its implementation plan, by embedding cyber risk awareness in organizational culture. We promote increased public awareness of disinformation activities by external actors, to improve Switzerland's capacity to anticipate and respond to such activities.
When citizens understand the above, they will be far better prepared to protect their families, their working environment, and their country.
Presentation for the Board of Directors and senior management
State-sponsored but independent hacking groups. The new long arm of States that exploits legal pluralism and makes the law a strategic instrument
About the presentation
According to Article 51 of the U.N. Charter: “Nothing in the present Charter shall impair the inherent right of individual or collective self-defense if an armed attack occurs against a Member of the United Nations, until the Security Council has taken measures necessary to maintain international peace and security.”
But, is a cyber-attack comparable to an armed attack?
There is no international consensus on a precise definition of a use of force, in or out of cyberspace. Nations assert different definitions and apply different thresholds for what constitutes a use of force.
For example, if cyber operations cause effects that, if caused by traditional physical means, would be regarded as a use of force under jus ad bellum, then such cyber operations would likely also be regarded as a use of force.
Important weaknesses of international law include the assumption that it is possible to isolate military and civilian targets with sufficient clarity, and to distinguish a tangible military objective to be attained from an attack.
More than 20 countries have announced their intent to use offensive cyber capabilities, in line with Article 2(4) and Article 51 of the United Nations (UN) Charter.
Unfortunately, these capabilities will not help when the attackers are State-sponsored groups, and the States supporting them claim not only that they are not involved, but also that their adversaries (the victims) have fabricated the evidence. This is a very effective disinformation operation.
Adversaries have already successfully exploited weaknesses of non-authoritarian societies, especially the divergent political and legal interpretation of facts by different political parties. It is difficult to use offensive cyber capabilities in line with democratic principles and international law, as it is almost impossible to distinguish with absolute certainty between nation-state attacks and attacks from state-sponsored independent groups.
Even when intelligence services know that an attack comes from a State that uses a state-sponsored independent group, they cannot disclose the information and the evidence that supports their assessment, as disclosures about technical and physical intelligence capabilities and initiatives can undermine current and future operations. This is the “second attribution problem” – they know but they cannot disclose what they know.
As an example, we will discuss the data breach at the U.S. Office of Personnel Management (OPM). OPM systems had information related to the background investigations of current, former, and prospective federal government employees, U.S. military personnel, and those for whom a federal background investigation was conducted. The attackers now have access to information about federal employees, federal retirees and former federal employees. They have access to military records, veterans' status information, addresses, dates of birth, job and pay history, health insurance and life insurance information, pension information, data on age, gender, race, even fingerprints.
Aldrich Ames, a former intelligence officer turned mole, has said: “Espionage, for the most part, involves finding a person who knows something or has something that you can induce them secretly to give to you. That almost always involves a betrayal of trust.”
Finding this person is much easier if you have data that is easily converted to intelligence, like the data stolen from the U.S. Office of Personnel Management (OPM). This leak is a direct risk to critical infrastructure.
There are questions to be answered, and decisions to be made, not only about tactics and strategy, but also about political and legal interpretation.
Our catalog, instructor-led training in Switzerland, Liechtenstein, and Germany: www.cyber-risk-gmbh.com/Cyber_Risk_GmbH_Catalog_2019.pdf