CODE HER TRAINERS


 

Stage 0

Brief Video

Brief Introduction

Code Her is a data-driven journalism (DDJ) course designed and delivered by a group of professional data journalism specialists, journalists, data engineers, and designers. It covers seven topics, including journalism, storytelling, the data pipeline, DDJ in teams, artificial intelligence, and data security.

The fundamentals of data journalism enable journalists to access an expanded range of sources, handle them, and use them effectively in their work. Data tells a story, too: it makes it possible to support stories with data and illustrate them with graphics, which leads to a better and deeper understanding for the reader and strengthens the journalistic work. Code Her trained a group of 26 young data journalists in the MENA region and taught them the fundamentals of data journalism. This enabled them to successfully discover new journalistic paths for themselves and achieve new successes. The contents and results of the workshops are available to interested parties on this website. You can also discover the world of data with us.

What to expect from this journey?

 

During this journey, you will learn how to use data engineering steps and techniques, along with storytelling and other important topics, and how to apply them in your journalistic work.

You will also learn how to build interactive data-driven reports using advanced tools and coding methodologies.

 


 

Stage 1

Intro Material

Introduction to Data Journalism

What is data journalism, how did it start, and what is the current landscape in the MENA region? All of those questions are answered in this section.

Journalism and New Media
Journalism is not something new, and neither is media! However, the new technologies shaping the media industry have given rise to new journalistic principles and disciplines. One of those new disciplines is Data Journalism!
Journalism vs Data Journalism
Check Attachment 2
Best Case Scenarios of Data Journalism
Check Attachment 3

Congrats!

• You have completed 10%

     
     
    Noha Belaid

    Holder of a doctorate in Media and Communication, and a short MBA in Economic Diplomacy and Lobbying. Expert in Media and Communication, Local Governance and Leadership

     
     
    Rama Jarmakani

     

    Journalist and Project Manager at DW Akademie

     

     

    Stage 2

    Intro to Data

    Introduction to Data Science

The science of data, or Data Science, is not a modern or new science. Data has always been part of research and studies since the 17th century; however, with the current industrial revolution, data has become more accessible, producible, and valuable for today's market needs. In fact, this revolution has made Data Science one of the hottest, most important, and most valuable sciences when integrated with other domains.

    What is Data
Data can be any atomic piece of information that, when combined and analyzed, can give a certain insight -- Check the handout
    Introduction to Data Science
Data Science can be described as the intersection of three sciences:
• Mathematics and Statistics
• Computer Science and Programming
• Domain Knowledge
Each of these three is a key dimension; however, domain knowledge is the most critical one.
    Data Science in Journalism
When it comes to Data Science in different industries, there are new domains where data science and domain sciences are combined. Examples: 1) Data Journalism (DDJ) is another name for "Data Science in Journalism"; 2) Social Data Science is another name for "Data Science in Social Life (Economy, History, ...)".
    Data Journalism vs Data Engineering
As we described before, Data Journalism is the combination of data science and journalism. Data Engineering, in contrast, is the pure data pipeline procedure that aims at interpreting and investigating data regardless of the domain knowledge. In Data Journalism practice, a data journalist is not expected to have deep knowledge of data engineering, but a lighter know-how of the data pipeline is required. In Data Engineering practice, a data engineer may be highly skilled at dealing with large amounts of data and complex analysis approaches, with coding and cloud engineering knowledge. What do they have in common? Both use the exact same pipeline!
    Data Driven Reports
Data-driven reports are reports driven by data: they contain analytical charts and possibly interactive visualizations. The data in a data-driven report serves as facts that support the author's hypothesis.

    Important Note!

    60%

    of the Dashboards you see are not DDJ!


    Congrats!

• You have completed 15%

       
       
      ALI SROUR (Alexsai)

      ALI is a Data Scientist, an AI research developer, a Trainer, and the head of research at SocialLab Academy of AI and Data Sciences

       

      Stage 3

      Data Pipeline

      Data Pipeline

In this stage, you will learn the whole Data Pipeline with Ali and Kassem: from selection and collection to analysis and visualization, step by step.

      Data Pipeline Steps

      • Data Selection
      • Data Collection
      • Data Preparation
      • Data Analysis
      • Data Visualization
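The five steps above can be sketched as a chain of small Python functions, one per pipeline stage. This is a minimal illustration, not a real workflow: the topic, rows, and numbers are all invented.

```python
# A minimal sketch of the five pipeline steps as plain Python functions.
# The dataset and its values are invented for illustration.

def select():
    # Data Selection: pick a topic and a source (here, a hard-coded sample).
    return "sample arrivals per year"

def collect():
    # Data Collection: in practice this would download or scrape data;
    # here we return raw rows as (year, arrivals) string tuples.
    return [("2019", "1200"), ("2020", " 950"), ("2021", "")]

def prepare(rows):
    # Data Preparation: drop empty values, strip whitespace, convert types.
    return [(int(y), int(v)) for y, v in rows if v.strip()]

def analyze(rows):
    # Data Analysis: a simple descriptive statistic (total arrivals).
    return sum(v for _, v in rows)

def visualize(total):
    # Data Visualization: here just a sentence; normally a chart.
    return f"Total arrivals in the sample: {total}"

report = visualize(analyze(prepare(collect())))
print(report)
```

Each stage consumes the previous stage's output, which is exactly how the real pipeline chains together, whatever tools you use for each step.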
       
       
       
      Kassem Shehady

       

      Kassem is a technical manager, data engineer, and a cloud solution architect

       

       

      Data Selection

Topic and data source selection is the entry point to any data-driven activity. In other words, to start your data-driven report or research, you first need to select your story and your data source. The more accurately you select your data sources and stories, the easier the implementation of the subsequent pipeline steps will be.

      Topic Selection
      To start, you need to classify your work based on its topic or category. For instance, you might be interested in speaking about refugees. In this case refugees will be the main topic. Then you can later dig deeper into the angle which you are trying to discuss within your data-driven work, based on the data that you have.
      Data Source Selection
After selecting the topic, it will be easier to start looking for a data source that is "likely" to provide data about the preselected topic.
      Data Source Validation
To validate your data source, you should look at two main things: 1) credibility - whether the source is capable of providing data about the preselected topic or its theme; 2) accuracy - whether the source is accurate and trustworthy in the eyes of the audience of your data-driven activity/report.
      The five W's
What (Topic), When (Time frame), Where (Location, Region), Which (Data Source), Why (This Data Source)

      Congrats!

• You have completed 20%
         

        Data Collection

After choosing the topic and selecting the data source(s), it is time to collect the needed datasets. But here comes the question: can we extract data from the selected data source? Check out the following subtopics after watching the video.

        Collection Methods
There are several methods of collecting data from predefined data sources: 1) download, 2) on demand/request/API, 3) crawl/scrape, 4) interpret.
        Collection Techniques
        Check handout 1
Crawling vs Scraping
Crawling and scraping are among the top collection methods when data is not freely accessible to users (when the data source does not offer open data features).
        Collection with Coding
Crawling and scraping can usually be done with code: developers create web scripts that extract structured data from websites by reading their source code. In addition, calling web interfaces via an API can also be handled through coding. Check out the video below.
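To make the idea concrete, here is a small scraping sketch using only Python's standard library. The HTML table is a made-up example embedded as a string; a real scraper would first download the page (for instance with `urllib.request.urlopen`), and many journalists use richer third-party libraries instead.

```python
# A minimal scraping sketch using only the standard library.
# The HTML snippet and its values are invented for illustration.
from html.parser import HTMLParser

PAGE = """
<table>
  <tr><td>Tunisia</td><td>11.8</td></tr>
  <tr><td>Jordan</td><td>11.1</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collects the text of every <td> cell it encounters."""
    def __init__(self):
        super().__init__()
        self.cells = []
        self.in_td = False

    def handle_starttag(self, tag, attrs):
        self.in_td = (tag == "td")

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_td = False

    def handle_data(self, data):
        if self.in_td and data.strip():
            self.cells.append(data.strip())

scraper = TableScraper()
scraper.feed(PAGE)
# Pair the cells back into (country, value) rows.
rows = list(zip(scraper.cells[0::2], scraper.cells[1::2]))
```

The key point is the one the paragraph makes: the script reads the page's source code (its tags) to pull out structured data, rather than copying what a human sees on screen.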

        Congrats!

• You have completed 30%
           

          Data Preparation

After the datasets have been collected and stored, it is time to prepare the data for analysis and visualization. Data preparation includes cleaning and filtering, which are usually the most important tasks in the whole data pipeline. Collected data is usually not ready to be analyzed or visualized because of extraneous data points and attributes that do not match the story's angle. Data engineers and data journalists usually spend the majority of their data-driven work on the preparation step. Watch the following short video!

          Cleaning vs Preparation
It is important to distinguish between cleaning and preparation. Cleaning means removing all unnecessary, unwanted, mistaken, inaccurate, and non-readable data. Preparation means readying your cleaned datasets for the analysis step, so that they match the story questions.
          Cleaning Steps
          Copy, Replace, Delete, Refine, Format, and Save are the main steps of the Cleaning pipeline step. (check the Handout)
          Cleaning tools
Google Sheets, MS Excel, and OpenRefine are some great tools to filter, edit, and clean your datasets.
          Cleaning with Coding
The power of coding in data preparation is, in practice, more relevant to data engineering on big datasets than to data journalism. However, code can unlock a lot of opportunities for data preparation.
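As a small illustration of cleaning with code, the sketch below walks through a few of the cleaning steps named above (Replace, Delete, Refine, Format) on invented rows, using only the standard library.

```python
# A small cleaning sketch; the raw rows and field names are invented.
raw = [
    {"country": " tunisia ", "value": "1,250"},
    {"country": "Jordan",    "value": "n/a"},
    {"country": "JORDAN",    "value": "980"},
]

cleaned = []
for row in raw:
    value = row["value"].replace(",", "")      # Replace: strip thousands separators
    if not value.isdigit():                    # Delete: drop non-numeric rows
        continue
    country = row["country"].strip().title()   # Refine: normalize spelling and case
    cleaned.append({"country": country, "value": int(value)})  # Format: cast types
```

The same operations are exactly what you would do by hand in Google Sheets or OpenRefine; code simply makes them repeatable when the dataset changes.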

          Congrats!

• You have completed 40%
             

            Data Analysis

After preparing and cleaning the datasets, it is time to analyze and investigate the data we have, to find answers to all the questions raised by our story. In other words, if a data journalist needs to answer or prove two questions/points in their data-driven work, they need to find those answers in the data first. Watch the video!

Analysis Levels
Descriptive Analytics, Diagnostic Analytics, Predictive Analytics, and Prescriptive Analytics
How can things be analysed?
            Check out the following handout link
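As a taste of descriptive analytics, the first of the levels listed above, here is a sketch using Python's built-in `statistics` module; the sample values are invented.

```python
# A descriptive-analytics sketch with invented yearly rates.
import statistics

rates = [11.8, 11.1, 13.4, 9.7, 12.2]  # hypothetical values

summary = {
    "mean": round(statistics.mean(rates), 2),     # average of the series
    "median": statistics.median(rates),           # middle value when sorted
    "stdev": round(statistics.stdev(rates), 2),   # spread around the mean
}
print(summary)
```

Diagnostic, predictive, and prescriptive analysis build on such basic descriptions: they ask why the numbers look this way, what they will look like next, and what should be done about it.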

            Congrats!

• You have completed 55%
               

              Storytelling

              Storytelling is the science of telling a story! Check out Ghenwa's video about the topic!

              Congrats!

• You have completed 60%
                 
                 
                Erik Tuckow

                Freelance Expert in Data Visualization, Illustration, and Infographic Design

                 
                 
                Ghenwa Abou Fayyad (noiŕe)

                 

                Freelance Art Director, Animator, and Illustrator. Founder and Director at noiŕe

                 

                 
                 
                Mohammad AlQaq

Visual artist and multimedia storytelling trainer. Mohammad uses different artistic platforms to express his ideas: singing, videography, photography, presenting, and acting. In the academic field, he serves as a judge for art students' graduation projects. In his visual art, photos, and films, Mohammad relies on simulating cases inspired by the real world and surrounding events.

                 

                Data Visualization

Data visualization is the last step of the data pipeline. The way we present data in charts and diagrams is very important: it lets the audience understand the topic easily, without effort. Check the following slides/handouts.

                Congrats!

• You have completed 70%

                   

                  Stage 4

                  Ethics and Privacy

                  Data Ethics and Privacy

                  Ethics is very important when working on data-driven projects. Besides considering journalistic ethics, it is really important in Data Journalism to look at each pipeline step from an ethical perspective.

                  Ethics in Data (not limited to)

                  • Sources
                  • Terms of Use and reuse
                  • Access to Data
                  • Limitations
• Law enforcement

                  Congrats!

• You have completed 80%

                     

                    Stage 5

                    Future of DDJ

                    DDJ in Teams

                    A Data journalist is not expected to be a data engineer, a designer, or a coder. However, a data journalist should be aware of all the expertise needed to produce a data-driven report individually. When working in media institutions and production teams, the ability to work in DDJ teams is an essential skill.

                    DDJ Teams components (mainly)
                    • Journalist
                    • Data Analyst
                    • Designer
                    • Web Developer

                    DDJ in the age of AI

                     

When it comes to Artificial Intelligence (AI) and fast-growing digitization, data journalists should adapt to the new technologies and tools that can speed up their work and add more value to their reports. In this regard, many AI tools, a number of them open source, are available to help individual journalists as well as media outlets and DDJ teams.

                     

                    Congrats!

• You have completed 100%