Information Literacy vs. Fake News: The Case of Ukraine

Abstract We profile the successful Learn to Discern information literacy program, developed in Ukraine in 2015 and now being adapted to the needs of other countries. Drawing on published documents, interviews, and the personal knowledge of one of the initiative's designers, we situate this work as a response to the particular challenges of the Ukrainian information environment following Russia's hybrid offensive, which began in 2014 with its aggressive deployment of propaganda and so-called "fake news." We argue that the Learn to Discern program was a coming together of three formerly separate strands: a focus on the development of modern library infrastructure, a distinctive Ukrainian model of information and media literacy, and the hands-on debunking of misinformation performed by the StopFake group.


Introduction and Methodology
This paper provides a contextualized description of the Learn to Discern information literacy program administered in Ukraine in 2015-16. We focus particularly on the connection of this program to earlier efforts to fight "fake news" in Ukraine, and to longer-running efforts to develop information literacy and library infrastructure in that country. It builds on our own earlier study (Haigh, Haigh, & Kozak, 2017) describing initial attempts by the volunteer journalist group StopFake, one of the key contributors to the new program, to fight a flood of misinformation produced by Russia in 2014. The new paper considers a promising longer-term effort to address both the specific problem of fake news stories and the broader issues of media bias and distrust that characterize Ukraine and, increasingly, the United States and other developed countries. We conducted fieldwork in Kyiv and Lviv in 2018, interviewing local media experts and participants in various initiatives. One of those interview subjects, Tetiana Matychak, provided additional in-depth information on the Learn to Discern program, and we included her as an author of this paper. Where possible, we have sourced or confirmed all points in publicly available documents, but some of the information presented here is based on her personal knowledge.
Collins Dictionary selected "fake news" as its 2017 word of the year (Segara, 2017). Oxford Dictionaries nominated "post-truth" as the 2016 word of the year (Wang, 2016). U.S. society is only beginning to grasp how the immense spread of disinformation influenced the opinions of millions of people. Investigations continue to demonstrate how unprepared social media companies were to resist the spread of misinformation and fake news (Confessore, Dance, Harris, & Hansen, 2018; Frenkel, Casey, & Mozur, 2018).
We define fake news here specifically as something that pretends to be a news story but is not the result of an actual reporting process. A real news story is the result of a process in which information is gathered and used to write a report. A fake news story has the form of a real news story but not the substance. What makes it fake is not that it is untrue, because actual reporting can be mistaken or biased or sloppy. Fake news is not badly reported news, any more than a fake ruby is a kind of ruby. At the most basic level, it may misidentify the country in which a photograph was taken, fabricate or misattribute quotations, or describe events that the author has made no attempt to corroborate. The presentation of fake news mimics the form of a real news story, for example by appearing on a website whose name, layout, writing style, etc. bear a superficial resemblance to those of sites carrying actual reporting. Much fake news is deliberately produced as propaganda, but fake news can also be created by someone who believes it to be true. We have developed this point more extensively in other work (Haigh & Haigh, forthcoming). Although the term "fake news" is new, and newly ubiquitous, many of the phenomena to which it is applied are much older. A recent book by Kevin Young, Bunk: The Rise of Hoaxes, Humbug, Plagiarists, Phonies, Post-Facts, and Fake News, traces fake news in the United States back to the 17th century. Young shows that hoaxes and false claims have never been random, but were instead intended to stir up the worst instincts and prejudices of various societal groups (Young, 2017). Nor is the concept of fake news new in the rest of the world (Burkhardt, 2017). Studies in psychology show that disinformation is much more effective when its message coincides with people's existing beliefs (Chan, Jones, Hall Jamieson, & Albarracín, 2017).
The notion of fake news is connected to the concepts of misinformation and disinformation; indeed, some have argued that disinformation is a better term (Waldrop, 2017; Wardle, 2017). Cooke looks at misinformation and disinformation as two sides of the same coin: misinformation is defined as incomplete and vague information, while disinformation is defined as purposefully distributed false information (Cooke, 2017). Disinformation is a real threat to global stability and the democratic way of life. Princeton historian Daniel Rodgers attributes the effectiveness of disinformation to the fracturing of society caused by a societal focus on individual behavior and individual exchanges (Rodgers, 2011).

Fake News and Information Science
Information science researchers have also begun to contribute to this burgeoning literature. Studies of information behavior and information literacy have emerged as a particularly important area in an environment of intentionally produced misinformation (Marwick & Lewis, 2017; Rosenwald, 2017; Waldrop, 2017).
Recently, libraries, with their educational and social mission and focus on information literacy, have become more visible in the struggle against fake news. Librarians promoted the need to critically evaluate information resources long before the notion of fake news became so prevalent in everyday life (Hobbs & Jensen, 2009; Thomas, Crow, & Franklin, 2011). Restoring trust in legitimate information sources is one of the crucial strategies in fighting disinformation, and the importance of information and media literacy, and of libraries, in this resistance cannot be overstated (Cooke, 2017). Information professionals are well equipped to deal with false claims and to educate the public with information literacy tools (Shenton & Hay-Gibson, 2011). Librarians actively employ methods to assess the credibility of information.

Ukraine's Information Infrastructure
Before we turn to the specific challenges of fake news in Ukraine, and the role of libraries and information literacy programs in combatting it, we must first say a little about Ukraine and the state of its information infrastructure prior to Russia's fake news offensive of 2014. This infrastructure has developed unevenly, reflecting the continuing influence of Ukraine's Soviet past as well as periodic swings from Russia towards the West and back again during the first two decades after Ukraine gained independence in the collapse of the Soviet Union in 1991. Ukraine went through the shock of two popular uprisings, in 2004 and in 2013, as its journey towards stable democracy proved far from direct. Mass protests in 2004 overturned the results of a fraud-ridden presidential election. This "Orange Revolution" led to the election of a pro-Western president. Ten years later, people again took to the streets during the "Revolution of Dignity," protesting against corruption and broken promises of closer integration with the European Union. As a result, the Kremlin-backed president Viktor Yanukovych fled to Russia in early 2014.

Library Programs
Libraries formed a crucial part of Ukrainian information education in both the Soviet and post-Soviet periods. Ukraine inherited an expansive library network from its Soviet past. During the Soviet period, libraries were a crucial part of the ideological infrastructure of Soviet society (Haigh, 2007, 2009). From the very early stages of the Soviet Union, Lenin's wife Krupskaya promoted the collectivization of books confiscated from private collections. The collectivization of books also facilitated Communist Party control over library contents. From the very beginning, Krupskaya stressed that only certain books should be made available to the masses. In 1924 she wrote: "There are books that organize and there are books that disorganize" (Likhtenshtein, 1978). As Lenin's wife, she established the principle of purging these "disorganizing" books from library collections (Verzhbizkiy, 1924). Throughout the Soviet period, librarians were responsible for maintaining up-to-date lists of forbidden works and removing them from public view. As Soviet rule spread to new states, librarians were trained in these techniques and required to move unsuitable books into "special collections" (Abramov, 1974).
After the collapse of the Soviet Union, the Ukrainian government instituted sweeping reforms in libraries, opening many collections and reorganizing library operations with the support of foreign organizations such as IREX (the International Research and Exchanges Board) and USAID. The IREX initiative Bibliomist was funded by the Gates Foundation. By the time it finished in 2012, its Ukrainian program had trained more than 4,000 librarians in updated computer skills and modernized close to two thousand libraries with new technology and publicly available Internet service (IREX, 2018a; Klinenberg, 2018, pp. 37, 46).
The Bibliomist program was explicitly intended to strengthen government support for libraries and to build the capabilities of the Ukrainian Library Association, so as to make these changes sustainable. As we will see below, these programs provided important infrastructure for subsequent information literacy initiatives. During this transition, the role of libraries in Ukraine shifted from ideological enforcement to social infrastructure. According to Ukrainian government statistics, the Ukrainian library network currently includes over 40,000 public, academic, and specialized libraries. The government sees libraries as centers for the promotion of democracy and education (Yatsenyuk, 2016).

Early Information Literacy Programs
Information and media literacy programs in Ukraine prior to 2014 took place parallel to, but separate from, these efforts to develop library infrastructure as a democratizing force. One of the most visible groups in this area was the Academy of Ukrainian Press (AUP), an affiliate of the international non-profit group Internews, which sponsored and coordinated the development of information literacy programs alongside other efforts to encourage journalistic professionalism and investigative reporting. AUP had a close academic alignment with international work on information literacy, which it combined with a mission to document and challenge the concentration of media ownership and press bias in Ukraine (for example, Ivanov et al., 2011). The group was associated with Kyiv National University, and in particular with Professor Valery Ivanov. Another Internews affiliate, Detector Media, focused on media monitoring to identify signs of biased reporting and hidden influence in Ukrainian journalism.
Another important source of interest in media literacy within Ukraine was the Institute of Social and Political Psychology, whose deputy director Dr. Lyubov Naydonova took an interest in the challenges of media education. In 2011 the group started a five-year pilot program, offering an elective course in media literacy in 120 participating schools (Dorosh, 2016). The program aimed to prepare teachers to teach information and media literacy skills in high schools as an elective course. Implementation and testing of the curriculum were quite limited due to the lack of funding from the government or NGOs, and as a pilot program its direct influence was limited.
This information media literacy curriculum focused on proficiency in technology usage and searching for information. It consisted of four interconnected parts, following a model developed by Naydonova: "the reaction (search for information, reading/scanning, identification/recognition of media texts), actualization (assimilation, integration of new knowledge related to the media), generation (incubation, the creative conversion, the transformation of media knowledge and skills), and use (transfer of information, innovation, research in the field of media)" (Silverblatt, 2013, p. 942).

Fake News in Ukraine: the Challenge of 2014
In 2014, dramatic events exposed the inadequacy of previous approaches to information literacy and led to a reorientation of IREX programs in Ukraine. Russia responded promptly and brutally to the fall of Viktor Yanukovych, annexing the Crimean peninsula and fomenting military conflict in the Donetsk and Luhansk regions of Eastern Ukraine.
Russia's offensive was waged as much on television and the Internet as on the battlefield (Rushing, 2017). This was the first large-scale international deployment of the Russian information warfare capabilities later used in the 2016 US presidential elections. In 2013 General Valery Gerasimov, Russia's first deputy defense minister, outlined a new Russian military doctrine which combined methods of military and information domination (Bartles, 2016).
One of the strategies of this "hybrid warfare" is information domination, spreading an overwhelming torrent of disinformation messages across different media. At that point Russian television channels were widely viewed inside Ukraine, and Ukrainians favored the Russian social media platform VKontakte over Facebook. Ukrainian and international observers quickly noticed patterns of apparent coordination, in which the same items of disinformation would appear suddenly in different media, including state-controlled domestic and international television channels (such as The First Channel World Wide and Russia Today), nominally independent but Kremlin-affiliated television outlets (such as NTV and Life News), sensationalist internet news sites, and thousands of social media accounts supposedly belonging to people in different countries (Cottiero, Kucharski, Olimpieva, & Orttung, 2015; Davis, 2014; Jonsson & Seely, 2015; Pomerantsev, 2014, 2015; Silverman, 2015; Snyder, 2014). Subsequent investigation revealed the involvement of the Internet Research Agency, a massive "troll farm" based in St Petersburg, where employees were expected to register accounts with fake identities and improvise hundreds of daily posts around assigned talking points (Chen, 2015). This organization became much better known later, after its operatives were charged by the Department of Justice with conspiracy to influence the outcome of the 2016 US elections.
The false stories, many of them strikingly amateurish in production or outlandish in content, included Nazis roaming the streets of Ukrainian cities, children crucified by the Ukrainian army in front of their mothers, photoshopped images of fake war atrocities committed by the Ukrainian army, and excerpts from Soviet-era World War II films presented as current-day reality. Yale historian Timothy Snyder explains how Russia combined the new technologies of social media platforms with new techniques of hybrid war, undermining the very existence of facts. This strategy left Americans and Europeans completely unprepared. Snyder identified a new style of "implausible deniability": "According to Russian propaganda, Ukrainian society was full of nationalists but not a nation; the Ukrainian state was repressive but did not exist; Russians were forced to speak Ukrainian though there was no such language."
Ukraine struggled to respond to this disinformation offensive. One of the most visible initial responses was the work of the volunteer journalist group StopFake, which rose to international prominence soon after its founding in 2014 (Bonch-Osmolovskaya, 2015; Khaldarova & Pantti, 2016; UkraineToday, 2015). StopFake combined elements of counterpropaganda work, political fact checking of the kind performed in the US by PolitiFact, and myth busting of the kind associated with Snopes.com. Its fact checkers, translators, and editors collaborated online to select dubious-seeming pieces of news, refute them using media forensics, and publish the resulting pieces of analysis in several languages (primarily English and Russian). An analysis of the group's early work showed that the techniques it used against fake news were more straightforward and less dependent on neutral experts than those used in traditional fact checking work, for example showing that a picture in a report had been manipulated from its original appearance or misleadingly captioned (for example, as showing devastation in Ukraine rather than Syria) (Haigh et al., 2017). StopFake was widely read in Russia and Ukraine, and its work featured prominently in reports on the conflict written by Western journalists.
StopFake and other counterpropaganda efforts helped to curtail the spread of some propaganda narratives into international reporting, but fact checking has some inherent limitations against the spread of misinformation, particularly via social media. Carefully produced responses take several days to research and verify, by which time the original fake news has been widely shared. Fake stories are engineered to create an emotional response and spread virally, whereas an effective rebuttal must be careful and rational and so is less likely to be widely shared. StopFake hoped to teach information literacy to its readers, both through the careful explanations provided in its postings and through instructions on the use of features like Google's image search. Yet such efforts can only go so far. The fake story and the rebuttal are likely to be seen by different audiences, as anyone visiting a news debunking site probably already has a high degree of engagement with information literacy issues. The effectiveness of debunking also tends to drop over time: once a hundred stories from a particular media outlet have been refuted, those still inclined to trust it are unlikely to have their minds changed by the debunking of a hundred more.
By 2015 Russia's initial dramatic attacks on Ukraine's physical integrity and information sphere had given way to the stasis of a carefully managed "frozen conflict" of the kind engineered by Russia within several of its neighboring states (including Georgia, Armenia, and Moldova). Its armed forces intervened to halt Ukrainian advances in Eastern Ukraine and force acceptance of a cease-fire line, leaving two of the largest Ukrainian cities outside government control. The information war likewise shifted from crisis mode to chronic problem, as Russian media moved from easily debunked fake stories to a subtler campaign highlighting and exaggerating bad news about Ukraine to drive home the idea of Ukraine as a failed state (Kuzio & D'Anieri, 2018; Shutow, 2014). One governmental response was to curtail the dissemination of many Russian television channels within Ukraine. Another, in 2017, was to block access to the Russian social network VKontakte, which was used extensively by trolls and which, since the expulsion of its founder Pavel Durov in 2014 (reportedly following his unwillingness to disclose information on Ukrainian protestors), has been owned and controlled by forces closely aligned with the Kremlin. These moves were controversial, criticized by some international groups such as Human Rights Watch and Reporters Without Borders as infringements on freedom of expression (Miller, 2017; Reuters, 2014). Many Ukrainian media experts we talked to defended the moves as a response to heavily biased foreign media during an unresolved military conflict.
That debate falls outside the scope of this article. Instead we are focused on another kind of long term response to the severe challenges faced by Ukraine which brings together all three of the threads introduced above: a media literacy program focused on the particular challenges of fake news and offered in libraries across the country.

Learn to Discern -Libraries and Information Literacy Come Together
The events of 2014, according to our informants, made it easier for groups in Ukraine to obtain international funding for information literacy projects. Educators and policy makers realized that if information consumers had a higher level of critical thinking, propaganda would be less effective. Sources of support included USAID, Internews Network, Deutsche Welle, MATRA (Netherlands), and the Canadian, American, British, and German Embassies (Zolotukhin, 2018). Libraries became important centers for information literacy programs for citizens. The ministry of education proposed the integration of information literacy programs into the curriculum in multiple grades, starting at the elementary school level. The goal of these programs is to train students in the knowledge and habits that will allow them to create and critically evaluate information content (Naydonova, 2015).
Fake news, which we define as stories whose main claims are flat-out fabricated, is still a problem in Ukraine, as in other countries. But so is biased and misleading reporting. While Russian-generated material, including troll factory output, is still identified as a problem by Ukrainian media experts, so are more subtle kinds of manipulation. As in many countries, media ownership in Ukraine is highly concentrated, and political groups, local and national, ally themselves with the interests of particular wealthy businessmen. This produces highly biased reporting. While some of these groups may ultimately be funded or supported from Russia, biased media poses the same challenge whether it serves domestic or foreign interests. This shift prompted a convergence of interests between participants whose backgrounds were in fighting fake news, primarily veterans of StopFake, and those active in more traditional media literacy programs focused on subtler kinds of bias. This alliance reflected the particular, and rather extreme, information environment faced by Ukrainians.
The project "Learn to Discern," originally called "Citizen Media Literacy," was started in February 2015 as a project of IREX, in collaboration with the Academy of Ukrainian Press and StopFake. The English-language name was later changed to the snappier L2D, while the Ukrainian name stayed the same. It was led by Mehri Druckman, IREX's director of programs in Kyiv. Financial support was provided by Canada's Department of Foreign Affairs, Trade, and Development (DFATD) through its Global Peace and Security Fund. IREX recognized that the new and urgent challenges justified a shift of emphasis from the building of library infrastructure to the particular challenges of fake news and media bias. The goal was to strengthen citizens' resilience in the face of destabilizing misinformation about the current conflict in Ukraine. The strategy of the program was to improve public information and media literacy through training and public campaigns encouraging informed media consumption.
Prior to the beginning of the new program, IREX conducted a survey of the population aged 18-70 in 14 regions of Ukraine to find out how media literate people were. The survey showed that more than 50% of the respondents were not able to distinguish fake news stories, blindly trusting the information received from the media. Most respondents also did not check the information they received, especially if it came from their acquaintances (IREX, 2018b; Susman-Peña & Vogt, 2017).
The stated goal of the program was to develop the ability of Ukrainian news consumers to discern the reliability of sources when presented with conflicting information from Ukrainian and Russian media; in particular, reporting on the conflict, government reforms, and the status of "IDPs" (internally displaced persons). The financial resources available to meet these ambitious goals were quite constrained: in the end, about $600,000 was raised. That is not much for a public education initiative on a national scale. Academic experts routinely teach, but their courses reach only a small number of students, and even the most popular public speakers connect with only a small part of the population. The project's goal was to deliver high quality information literacy training, incorporating academic insights and proven pedagogical practices, across the country. That required a separation of roles, between a small group designing the new program and a large group of volunteers delivering the new information literacy training in a form that could be understood by ordinary citizens during a jargon-free one-day training session.
The first step was assembling a steering committee of 15 experts in media literacy, fake news, and counterpropaganda work. The committee was briefed on the objectives and planned activities for the program. It was charged with the design of the media literacy curriculum, to be documented in a textbook issued to the instructors who would run public training sessions. Committee members contributed material related to their particular areas of expertise. This pulled together all the formerly separate strands mentioned earlier: the Internews affiliates' focus on media bias and ownership, StopFake's hands-on work to develop techniques to identify and rebut fake news, and Naydonova's model of media literacy. The work of the committee was also influenced by Renee Hobbs and her colleagues in the United States, who pioneered the modern concept of media literacy (Hobbs & Frost, 2003; Hobbs & Jensen, 2009). Hobbs now leads the Media Education Lab at the University of Rhode Island.
Following a two-day meeting in July, the textbook was completed in October 2015. The final version is 193 pages long and contains many annotated examples of fake news, propaganda, and biased reporting taken from Russian and Ukrainian media. The focus was on mass media rather than social media, as newspapers and television were still seen as the main news sources for most Ukrainians. The material is presented clearly, without jargon or academic theory, in ways designed to be as accessible as possible to the lives of ordinary Ukrainians. It was designed to follow the structure of a training session, being split into three parts with instructions specifying how much class time to spend on each one. "The first chapter provides participants with a basic understanding of information and propaganda. It gives an overview of the types of mass media, their work, objectivity, and media ownership. The second chapter covers manipulation, fake news, and propaganda and their dangers. Through practical exercises, participants gain experience analyzing media content (headlines, texts, pictures, and videos) using debunking tools and identifying markers of fakes, manipulation, and propaganda. The third chapter explores the consequences of dehumanization, stereotypes, and hate speech in the media. All material is written in a simple, easy-to-understand way and contains numerous examples, exercises, and handouts that help participants not only to learn, but also to share the information with friends and relatives." (IREX, AUP, & StopFake, 2015)

Training the Instructors
Delivering the new program designed by the steering committee involved a three-tier pyramid of training. Members of the public would learn from instructors, who were themselves trained by members of a group known within the project as "master trainers." The master trainers were selected to represent different regions of Ukraine: Kyiv, Rivne, Odessa, Vinnytsia, and the Donbass. All of them were professional media workers with experience in teaching or running training sessions. Among them were journalism teachers at universities, communications specialists, and project managers in public organizations. In October 2015, these 14 trainers, men and women, were gathered for three days near Kyiv. They received training from two members of the steering committee that had produced the textbook: Diana Dutsik, who at that time worked as the executive director of the public organization Detector Media, and Tetiana Matychak (one of the authors of this article). To work effectively with the new material, the master trainers had to unlearn some of the things they took for granted in other educational settings. For example, those used to training journalists were accustomed to using academic language and theory, in contrast to the new course's focus on practical media skills for members of the public.
The master trainers were grouped into pairs. Each pair was responsible for running two training sessions for information literacy instructors, during October or November 2015. Each of the 14 training sessions was held in a different region of central or eastern Ukraine. The master trainers were assigned regions with which they were already familiar, and expected to select examples and methods from those offered in the manual that would resonate with the local population. They were also encouraged to look for more recent examples, to maintain the topicality of the sessions as disinformation tactics evolved.
Altogether, these sessions trained 428 information literacy instructors (AUP, 2015). They worked on a volunteer basis. Among them were librarians, public figures, teachers, and university professors. Each agreed to hold at least one media education training session for the public over the next six months, and each received both paper and electronic versions of the textbook. The choice of Ukrainian or Russian materials depended on which language was more widespread in the region where the instructor worked.
According to an IREX report, "By choosing trainers who were influential in their social and professional networks, training participants were more likely to trust what they learned about propaganda and disinformation than they would information from a stranger." (Murrock, Amulya, Druckman, & Liubyva, 2018)

Training the Public
The program ran for a nine-month period, spanning 2016 and 2017. It trained citizens from across Ukrainian society, including high school teachers and students, professional union members, medical workers, police officers, and library patrons. Altogether, around 15,000 people participated in the one-day sessions, held in public venues. Much of the project's budget went to advertising these events and encouraging participation. According to IREX, "We conducted a highly visible public media literacy campaign with public service announcements, billboards, print materials, and social media, in Russian and Ukrainian, to build awareness about disinformation, amplify the effect of the trainings, and reach a wider audience." (IREX, 2018b) There was no money to pay for meeting space to hold training sessions. Some took place in rooms provided by schools, universities, or other public organizations. Libraries were particularly important as hosts. IREX had a strong relationship with Ukrainian libraries, having equipped libraries across the country with computers and internet access in its Bibliomist program. This provided suitable spaces in parts of the country where the facilities needed to administer the program, including internet access and digital projectors, might not otherwise be easily accessible. Libraries welcomed the influx of participants, who might not be regular visitors, and were particularly appropriate sites for some of the populations most vulnerable to propaganda messages, including veterans, low income families, and "internally displaced persons" (refugees from Crimea and separatist-held territory in Eastern Ukraine).
Instructors were asked to engage participants in a participatory dialog, rather than giving a traditional lecture. Methods used during the training included mini-lectures, presentations, group work, practical individual exercises, discussions, brainstorming, and visualization. An important element of all training sessions was the demonstration of real-life examples: fakes, propaganda, and hate speech. This allowed the audience to be persuaded by actual cases, rather than by theoretical examples from textbooks. The instructors were expected to appear impartial and to oppose propaganda whether it served Russian or Ukrainian interests within the conflict. They had to be able to control their own emotions and read the mood of the group being trained. Because the training sessions were offered in different parts of the country, participants might have strong feelings of identification with the Ukrainian government or emotional attachments to Russian narratives. If instructors sensed strong emotional reactions from participants, they could shift to other examples, including media samples without a direct connection to the conflict.
In general, the program consisted of three parts, following the structure of the textbook. Section 1, "Information and mass media," covered media influence, propaganda, and methods of recognizing it. Section 2, "Manipulation and struggle," showed examples of fake news stories and ways to refute them. Section 3, "Hate speech," discussed hostile language and explained why it should be avoided. For each of these, the textbook provided examples of exercises, presentations, or discussions. The description of each activity specified its duration, the materials needed, and its goals.
The overall learning outcomes aimed to ensure that trainees could:
- Understand the basic principles of media work, analyze their media space, and consciously approach media consumption issues,
- Understand the mechanisms of media influence and proceed to selective perception of information,
- Learn to more clearly identify propaganda, fake media, and manipulations in the media, so as not to fall under their influence,
- Develop critical thinking and analyze information coming from different types of media,
- Learn about the mechanisms and consequences of using stereotypes and hate speech in the media, and learn to recognize this type of manipulation.
During the training sessions students frequently asked the following questions:
- Who owns Ukrainian TV channels, newspapers, and websites?
- How do journalists work?
- How can one distinguish facts from judgments?
- What are the simplest ways to check fakes, especially for older people?
- How can one convince an opponent that he is mistaken and bring accurate information to him on a particular issue?
- How can one become psychologically resistant to the fake information that appears in the media every day?
- Where can I get more information about media literacy?

Impact and Follow On
One of the distinctive features of the program was the efforts made by IREX to verify its effectiveness. Because of the tight timeline, the textbook exercises were not experimentally validated before the large scale training took place. In fact the book included a notice that "This curriculum is a work in progress and will continue to be updated in response to feedback from CMLP trainers and participants" (IREX et al., 2015).
Instead, evaluation was performed to determine how successful the program as a whole had been in improving the information literacy skills of its participants. We already mentioned the baseline survey performed by IREX on Ukrainians who had not received training. IREX followed this with a survey administered one month after training. The main validation of the training program took place eighteen months after the initial training, to determine its lasting effect. IREX administered the same test to two large groups (200 people each), one of people who had been trained and the other an otherwise similar control group. The study (Murrock et al., 2018) determined that people who had been trained were more alert to possible misinformation and were more aware of media ownership issues. Eighty-two percent of participants reported habitually cross-checking claims found in news reports. They also felt more confident than the control group in their ability to distinguish fake news from genuine reporting, which made them more inclined to trust news media as a whole -an important result, as disinformation works by eroding trust in the possibility of knowing the truth. According to a report in Slate.com, Renee Hobbs called this assessment exercise "groundbreaking" as a new model for how to "measure media literacy competencies acquired by adults through formal media education programs." (Guernsey, 2018) IREX claims an impact on Ukrainian information literacy beyond the groups who directly attended the training sessions. The program empowered citizens to teach each other. According to an online summary, 91% of participants reported sharing their new knowledge and skills with an average of 6 friends, relatives, or colleagues each, reaching 90,000 Ukrainians in total by the end of the program (IREX, 2018b). Although the official program is over, many librarians trained in information literacy continue to offer free training sessions in their local libraries.
When we interviewed Mehri Druckman in August 2018, she explained that IREX was also working to apply the program in other countries in which it was already active, such as Jordan, and was seeking funding to extend this work in the post-Soviet states of Georgia, Lithuania, Latvia, and Estonia. The success of the Learn to Discern program led IREX to brand a new initiative, building on Lyubov Naidenova's previous efforts, as "Learn to Discern in Schools" (Dorosh, 2016). IREX received grants from the US Embassy and the British Embassy for this project, which is being implemented jointly with the Ministry of Education and Science of Ukraine, as well as in partnership with AUP and StopFake. In the current school year, 50 schools participated, in cities including Chernihiv, Dnipro, Mariupol, and Ternopil. Media literacy is being introduced into such subjects as art (Grade 9), Ukrainian language and literature (Grade 8), and history of Ukraine and world history (Grade 8). If the project is successful, it will be extended throughout Ukraine. Some material from the Learn to Discern training handbook is being incorporated into educational material for teachers (Malinka, 2018).
Whereas the Learn to Discern program was focused primarily on traditional mass media, IREX has also been exploring education focused on the specific challenges of social media. A 2018 program, "Care before share," recognizes that in an age of social media ordinary citizens are effectively serving as news outlets, or media gatekeepers, for their family and friends. It encourages them to carry out this task more self-consciously.

Conclusions
Grounded in the particular challenges of the Ukrainian environment, the Learn to Discern program combined elements of traditional information and media literacy education with a distinctive emphasis on the detection of propaganda and outright fake news. Back in 2015, when the program was being designed, the problem of "fake news" was little discussed in America, and even the phrase itself had a strange, unfamiliar ring to most native English speakers.
That all changed in late 2016. Today the Ukrainian media landscape seems more recognizable to Westerners: more extreme in some of its challenges, but stalked by the same issues of political bias, foreign agents, bot nets, online conspiracy theories, and paid trolls that have dominated so much recent analysis of contemporary American media practices. Looking to Ukraine, and in particular to the Learn to Discern program, for possible lessons no longer seems as outlandish as it would have done in 2015. From the Director of IREX Ukraine, Mehri Druckman, we learned that IREX was working in 2018 to pilot a version of the Learn to Discern program in the United States, in partnership with local universities and civil society groups. The instruction manual had already been translated into English (and is available online). Pilot programs were planned for Arizona and New Jersey. "Librarians around the U.S. are also contacting IREX to find out how to roll it out in their communities" (Guernsey, 2018). Having suffered the heaviest onslaught of fake news, Ukraine has developed tools that may prove useful to other countries facing similar internal and external threats to their democratic frameworks.
Fake news will never be entirely beaten. It has a long history, and its spread is driven by fundamental human tendencies to share shocking information and to prefer information that conforms with existing beliefs. Technological changes are making fake news easier to produce and distribute than ever before, which makes it attractive to profit-seeking entrepreneurs and military strategists alike. Yet those challenges make the modest but measurable accomplishments of programs such as Learn to Discern even more important. Libraries, like responsible news reporting, are losing their hold on the public in what is often called a "post-truth" era. Anything that helps to rebuild trust in reliable information sources and helps the public distinguish real news from fake has a part to play in bolstering democracy and strengthening civil society.