Lessons in Digital Technology Learned from the Invasion of Ukraine

March 16, 2022
Digital Policy Forum Japan

 International peace is the most sacred thing desired by the people of the world. The recent invasion of Ukraine by Russia is unacceptable from any perspective. The Digital Policy Forum Japan (DPFJ) stands by this statement and is working to collect example cases relating to digital technology in the invasion. We believe that organizing the issues these cases raise, using them to enrich the discourse on digital policy within the DPFJ (the 5 Digital Policy Agenda Items[1]), and bringing them to the public as a topic of discussion on the invasion and digital policy is both important and beneficial. As the invasion is ongoing, we will update this document moving forward. This document does not represent the opinion of all DPFJ members; rather, it was created by those members who stand in agreement with its intent (proponents are listed at the end of the document).
 The DPFJ will take this discussion beyond the borders of Japan and proactively advance it together with international stakeholders moving forward.

Point 1: The Usage of Digital Technology in Hybrid Warfare

 Digital technology has come to form the foundation of the nervous system of nations. In modern conflicts, cyberattacks are carried out from well before physical hostilities begin and continue through the start of physical attacks. Disinformation is also being used more frequently than ever in attempts to manipulate public opinion.
 For example, there was a significant increase in DDoS attacks against Ukraine leading up to the Russian invasion[2], and a great many attempts were made to manipulate public opinion using disinformation (especially falsified videos)[3]. The spread of such misinformation undermines the integrity of information, and is therefore considered a form of cyberattack.
 These issues pose challenges in hybrid warfare (gray zones) that require nations to consider, among other things: collaboration between the public and private sectors in areas such as information sharing during the transition to an emergency (in both cyberspace and real space); how to establish a response system and determine who should handle escalation and how; and how to build a national cyber defense system.

Point 2: Reinforcing Anti-Disinformation Systems in Both Emergency and Non-Emergency Situations

 Much of the disinformation in circulation of late, and many of the fake accounts spreading it, are suspected to be state-backed or generated by AI[4]. There are also many indications that much of the disinformation we are seeing was created prior to the invasion. These observations are based on objective evidence gathered by fact-checking organizations such as Bellingcat[5] in the UK, working in cooperation with established media outlets[6].
 All the while, Russia is moving to severely restrict internet access, even blocking social media giants like Facebook[7], and a law prohibiting the dissemination of “disinformation” regarding the Russian military has been passed and is being enforced[8]. This has led Western media outlets to cease or reduce their activities[9].
 While all-encompassing legal regulation of disinformation is not appropriate, it is still necessary to establish a system to block or suppress disinformation with suspected state involvement during times of emergency (for example, by labeling it with alerts and warnings, disabling retweets of it, and suppressing its display). It is also important for the state to analyze disinformation from the perspective of national security, keeping both the public interest and the protection of human rights in mind during such analysis. Co-regulation is appropriate for handling disinformation during peacetime, and it would be prudent to exercise caution before suddenly expanding legal restrictions even during times of emergency.
 States should also consider expanding support for private-sector fact-checking organizations that promote the public interest (e.g., expanding tax incentives for donations).
 Further, states could provide active support for the development of AI-based technologies that can discern disinformation and evaluate the credibility of information, as well as promote the dissemination of the results these technologies yield.
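 As an illustration of the kind of AI-based screening referred to above, the following is a minimal sketch (in Python, using scikit-learn) of a text classifier that scores posts for likely disinformation. The training examples, labels, and test post are hypothetical placeholders; a real system would require a large, curated corpus, continuous retraining, and human review of every flagged item.

# Minimal, hypothetical sketch: disinformation screening as a text-classification task.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = previously fact-checked as false, 0 = verified.
texts = [
    "Official statement confirms a humanitarian corridor opened today.",
    "Leaked video proves the attack was staged by actors.",
    "Agency reports verified casualty figures with named sources.",
    "Secret document shows the government faked the entire event.",
]
labels = [0, 1, 0, 1]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# The output is a screening signal for human fact-checkers, not a verdict.
score = model.predict_proba(["Viral clip shows the incident was fabricated."])[0][1]
print(f"estimated probability of disinformation: {score:.2f}")

 Even this sketch makes the limitation clear: such tools can only prioritize content for human verification, and the credibility judgment itself must remain with people.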
 Of course, the efforts covered here under Point 2 would need to be properly balanced with freedom of speech and freedom of the press.

Point 3: The Usage of Digital Technology Within Areas Under Conflict

 The development of the invasion has given rise to new initiatives and considerations regarding the use of digital technology. As mentioned in Point 2, fact-checking organizations are working to analyze disinformation by verifying the creation dates of associated files and by collecting evidence through comparison with satellite imagery. In addition, they are giving visible shape to the damage done by mapping[10] incidents in Ukraine using OSINT (Open Source INTelligence). SpaceX is also quickly working to provide satellite broadband service (Starlink)[11].
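 As a concrete illustration of the file-verification step mentioned above, the following is a minimal sketch (in Python, using the Pillow library) that reads the embedded capture date from an image's EXIF metadata and compares it with the date the image is claimed to depict. The file name and claimed date are hypothetical, and the absence of a mismatch proves nothing, since metadata can be stripped or forged; real fact-checking combines such checks with satellite imagery, geolocation, and other evidence.

# Hypothetical sketch: compare an image's EXIF capture date with the claimed date.
from PIL import Image

def capture_date(path):
    """Return the EXIF DateTimeOriginal string if present, otherwise None."""
    with Image.open(path) as img:
        exif = img.getexif()
        exif_ifd = exif.get_ifd(0x8769)  # Exif sub-IFD holding camera metadata
        return exif_ifd.get(36867)       # tag 36867 = DateTimeOriginal

claimed_date = "2022:02:24"                  # date the image is claimed to show
recorded = capture_date("viral_frame.jpg")   # hypothetical file name
if recorded is None:
    print("no capture date in metadata; the claim can be neither confirmed nor refuted")
elif not recorded.startswith(claimed_date):
    print(f"mismatch: metadata says {recorded}, claim says {claimed_date}")
else:
    print("capture date is consistent with the claim (not proof of authenticity)")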
 Economic sanctions have also been imposed on Russia, locking some Russian banks out of the international payments system (SWIFT), and financial support has been sent to Ukraine by turning digital content into non-fungible tokens (NFTs) and placing them on marketplaces[12].
 The Ukrainian government made direct requests to SpaceX for satellite broadband service via social media. It also called on major media providers to stop providing services in Russia[13], and has made other requests directly to individual companies in the private sector.
 While giving visible shape to the damage caused can help reveal excessive aggression (ensuring transparency through real, accurate information) and garner public interest internationally, it can also lead to the identification of attack targets. States need to consider how information should be handled locally during times of emergency (including on social media, etc.).
 There are also a number of issues to consider regarding economic assistance and sanctions that make use of digital technology. For example, it has been pointed out[14][15] that exclusion from SWIFT may accelerate the shift to crypto assets (which allow high levels of anonymity and whose accounts are difficult to suspend).
 Further, there are problems with imposing sanctions by blocking the internet across an entire nation in a conflict. Such sanctions restrict the means by which the people of that country can obtain information from both inside and outside their country, making it impossible for them to make the right decisions for their situation.
 We also need to examine the appropriateness of using AI, drones, and robots in war.
 Regardless, we are seeing developments with no precedent, including economic support using NFTs, as mentioned above. There is little doubt that new uses for digital technology will emerge moving forward. 
 In addition, a number of US companies are suspending digital services and products in Russia[16][17]. We need to examine whether leaving this decision up to individual private-sector companies is appropriate, or whether it should be treated as part of the economic sanctions (digital sanctions) that governments impose as part of their foreign strategy.

[1]https://www.digitalpolicyforum.jp/about
[2]https://www.itmedia.co.jp/news/articles/2202/26/news086.html
[3]https://www.buzzfeed.com/jp/kotahatachi/debunk-ukraina
[4]https://gigazine.net/news/20220301-ai-made-sns-pic/
[5]https://www.bellingcat.com/
[6]https://www.nikkei.com/article/DGXZQOFH231ER0T20C22A2000000/
[7]https://www3.nhk.or.jp/news/html/20220227/k10013503361000.html
[8]https://www.bloomberg.co.jp/news/articles/2022-03-04/R88AHVDWX2PV01
[9]https://www.jiji.com/jc/article?k=2022030600335&g=int
[10]https://maphub.net/Cen4infoRes/russian-ukraine-monitor
[11]https://www.nikkei.com/article/DGXZQOGN270KQ0X20C22A2000000/
[12]https://www.nikkei.com/article/DGKKZO58907180Y2A300C2EE9000/ (members-only article)
[13]https://www.nikkei.com/article/DGKKZO58612800X20C22A2TB0000/ (members-only article)
[14]https://www.nikkei.com/article/DGKKZO58766720T00C22A3EE9000/ (members-only article)
[15]https://forbesjapan.com/articles/detail/46121/1/1/1
[16]https://www.reuters.com/business/microsoft-suspends-product-sales-services-russia-2022-03-04/
[17]https://www.itmedia.co.jp/news/articles/2203/09/news124.html

End of document

[Digital Policy Forum Japan]

■DPFJ Incorporators
Masaru Kitsuregawa (Director of the Inter-University Research Institute Corporation / Research Organization of Information and Systems; Distinguished professor at Tokyo University)
Osamu Sudo (Director of the ELSI Center at Chuo University, professor at the Global Informatics Faculty of the same; specially-appointed professor at the Tokyo University Graduate School, professor emeritus at Tokyo University)
Hideyuki Tokuda (Chairman of the National Institute of Information and Communications Technology, professor emeritus at Keio University)
Ichiya Nakamura (President of iU)
Masao Horibe (Professor emeritus at Hitotsubashi University)

■DPFJ Proponents
Nanako Ishido (Professor at the Keio University Graduate School of Media Design)
Youichiro Itakura (Partner at Hikari Sogoh Law Offices)
Takashi Uchiyama (Professor at the Aoyama Gakuin University School of Cultural and Creative Studies)
Hiroshi Esaki (Professor at the Tokyo University Graduate School of Information Science and Technology, representative of the WIDE Project)
Mirai Odagiri (Project Researcher at the Tokyo University Institute for Future Initiatives)
Nobuko Kawashima (Professor at the Doshisha University Faculty of Economics)
Naoto Kikuchi (Project Professor at the Keio University Graduate School of Media Design)
Tatsuya Kurosawa (Representative director of Kuwadate Co., Ltd., specially-appointed associate professor at the Keio University Graduate School of Media and Governance)
Jiro Kokuryo (Professor at the Keio University Faculty of Policy Management)
Masayoshi Sakai (Counselor for the Information-technology Promotion Agency, associate professor at iU)
Masahiko Shoji (Professor at the Musashi University Faculty of Sociology)
Toshiya Jitsuzumi (Professor at the Chuo University Faculty of Policy Studies)
Hideki Sunahara (Professor at the Keio University Graduate School of Media Design)
Masahiro Sogabe (Professor at the Kyoto University Graduate School of Law)
Yasuhiko Taniwaki (Advisor at the You Go Lab)
Shinji Terada (Executive chief researcher at JIPDEC)
Osamu Nakamura (Professor at the Keio University Faculty of Environment and Information Studies)
Ryosuke Nishida (Associate professor at the Institute for Liberal Arts, Tokyo Institute of Technology)
Shuya Hayashi (Professor at the Nagoya University Graduate School of Law)
Fumihiro Murakami (Chief Researcher at the Mitsubishi Research Institute Co., Ltd., Digital Innovation Division)
Toshiya Watanabe (Professor, executive vice-dean, and vice-director at the Institute for Future Initiatives at Tokyo University)

<Contact Information>
Digital Policy Forum Secretariat
dpfj@yougolab.jp