Practical Guide To Scripting, Shooting and Post-Production
Guide to Creating Videos for Journalists and Content Creators.
Journalistic videos serve different purposes depending on the context and goals of the story.
This is a detailed guide to creating various types of journalistic videos, including their definitions, purposes, tips, and workflows.
Each video in the AI Tools for Journalists series is guided by a few questions:
- What is the primary message or goal of the video?
- Who is the target audience for this video?
- What tone or mood are you aiming to achieve?
- How long would you like your video to be?
Approach for the Series:
- Goal: Equip journalists and researchers with knowledge and tools to identify and counteract harmful media content.
- Audience: Professional journalists, trainee journalists, students, and researchers.
- Tone: Authoritative yet approachable and pedagogical, suitable for both in-class and online learning.
- Video Length: ~3 minutes per video (short enough to keep attention but detailed enough to be educational).
Expanded Syllabus: Critical Media Literacy and Epistemic Scaffolding for Journalists
Objective: Empower journalists with critical thinking tools, ethical grounding, and awareness of cognitive biases to produce accurate, fair, and impactful news stories while leveraging AI and digital tools. This syllabus emphasises epistemic scaffolding to build journalists’ capacity to analyse, deconstruct, and challenge their own beliefs and biases, ensuring credibility in the digital age.
Module 1: Foundations of Critical Media Literacy
- Understanding Media Manipulation
Content: Definitions and distinctions: misinformation, disinformation, malinformation. Propaganda tactics: framing, agenda-setting, emotional manipulation.
Media literacy exercises: analysing real-world examples of manipulated content.
Critical Component: Identifying how personal beliefs influence susceptibility to manipulated content. Tools for self-reflection: journaling instances where personal biases impacted judgment.
- Spotting Red Flags in Digital Content
Content: Techniques to recognise biased headlines, exaggerated claims, and logical fallacies. Algorithms and echo chambers: how personalisation feeds biases. Practicum: evaluating social media feeds to identify echo chamber effects.
Critical Component: Building epistemic awareness: questioning the origins and credibility of information. Addressing confirmation bias: frameworks for evaluating contrary evidence objectively.
Module 2: Fact-Checking Tools and Techniques
- AI Tools for Fact-Checking: Introduction to tools: FactCheck.org, PolitiFact, Snopes.
Integrating AI for fact-checking at speed during breaking news.
Workflow for verifying sources and citations in user-generated content.
Critical Component: Avoiding reliance on a single source: triangulating facts.
Recognising bias in fact-checking tools and platforms themselves.
- Image and Video Verification with AI: Tools: InVID, Forensically, and Google Reverse Image Search.
Steps for analysing image metadata, geolocation, and timestamps.
Practical task: Verify the authenticity of viral media.
Critical Component: Self-assessment: How biases influence assumptions about image sources.
Case study analysis: Impact of prejudiced assumptions on misreporting.
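The metadata step above can be sketched in code. This is a minimal illustration, assuming the Pillow library is installed and using a hypothetical file name; note that most social platforms strip EXIF data on upload, so missing metadata is common and is not, by itself, evidence of tampering.

```python
# Minimal sketch: reading EXIF metadata (timestamps, camera, GPS) from an
# image with Pillow. Social platforms usually strip EXIF on upload, so an
# absence of metadata is a normal finding, not proof of manipulation.
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path):
    """Return EXIF tags as a {tag_name: value} dict (empty if none present)."""
    img = Image.open(path)
    exif = img.getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

if __name__ == "__main__":
    tags = read_exif("viral_photo.jpg")  # hypothetical local file
    for key in ("DateTime", "Make", "Model", "GPSInfo"):
        print(key, "->", tags.get(key, "absent"))
```

In a verification workflow this check comes first because it is cheap; geolocation and reverse image search (InVID, Google Reverse Image Search) then corroborate or contradict whatever the metadata claims.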
Module 3: Countering Deepfakes and AI-Generated Content
- Detecting Deepfakes and Synthetic Media: (a) Technical overview of deepfake creation and detection tools, (b) Tools: Deepware Scanner, Microsoft’s Video Authenticator, and (c) Workshop: Identifying subtle inconsistencies in deepfake videos.
Critical Component: Epistemic vigilance: questioning the authenticity of “too-perfect” narratives.
Ethics of reporting on deepfakes: avoiding unintended amplification.
- Combating Synthetic Text and Audio Content: (a) Identifying AI-generated text and audio using tools like Originality.AI, (b) Common patterns in AI content: uniform tone, lack of nuance, and (c) Practical exercise: Distinguish AI-generated articles from human-written ones.
Critical Component: Exploring cognitive biases: How preconceived notions affect judgments of authenticity. Discussing ethical dilemmas: balancing skepticism with fairness.
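One of the "uniform tone" patterns named above can be made concrete: AI-generated prose often shows less variation in sentence length ("burstiness") than human writing. The sketch below is a teaching heuristic only, assuming simple punctuation-based sentence splitting; it is nowhere near a real detector like Originality.AI and should never be used to accuse anyone of using AI.

```python
# Crude illustration of one "uniform tone" signal: variance in sentence
# length (burstiness). This is a classroom heuristic, NOT a detector.
import re
import statistics

def sentence_lengths(text):
    """Split on sentence-ending punctuation and count words per sentence."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return [len(s.split()) for s in sentences]

def burstiness(text):
    """Population std. dev. of sentence lengths; higher = more varied rhythm."""
    lengths = sentence_lengths(text)
    return statistics.pstdev(lengths) if len(lengths) > 1 else 0.0

uniform = "The cat sat down. The dog ran off. The bird flew away."
varied = ("Stop. After hours of waiting in the rain, the witnesses "
          "finally spoke. Silence followed.")

print("uniform:", burstiness(uniform))  # 0.0 for identical sentence lengths
print("varied:", burstiness(varied))
```

A classroom exercise: have students compute burstiness on a known-human and a known-AI passage, then discuss why a single number cannot settle authorship.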
Module 4: Data Visualisation and Storytelling
- Ethical AI-Generated Graphics and Visuals: (a) Overview of Canva, Adobe Firefly, and DALL-E, (b) Best practices for using AI-generated visuals responsibly, and (c) Exercise: Create visuals to complement a news story ethically.
Critical Component: Examining bias in visual data representation: scale, colour, and framing.
Self-reflection: How personal aesthetics may skew graphic design choices.
- Building Credible Data Visualisations: (a) Tools: Flourish, Datawrapper, and Tableau, (b) Avoiding misleading visualisations: axis manipulation, cherry-picking data, and (c) Exercise: Recreate flawed visualisations and correct them.
Critical Component: (a) Understanding data bias: questioning the origin, methodology, and intent behind data sets. (b) Fostering a culture of accountability: ensuring visuals aid rather than distort understanding.
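Axis manipulation can be shown numerically rather than visually. The sketch below, using hypothetical poll numbers, computes the ratio of bar heights as a chart would draw them: two values that differ by 2% appear three times taller when the y-axis is truncated.

```python
# Numeric illustration of axis manipulation: how a truncated y-axis
# exaggerates a small difference between two bars.
def apparent_ratio(a, b, baseline):
    """Ratio of drawn bar heights when the y-axis starts at `baseline`."""
    return (a - baseline) / (b - baseline)

poll_a, poll_b = 51.0, 50.0  # hypothetical poll results, a 2% difference

print(apparent_ratio(poll_a, poll_b, baseline=0.0))   # honest axis: 1.02
print(apparent_ratio(poll_a, poll_b, baseline=49.5))  # truncated axis: 3.0
```

The correction exercise in (c) above is exactly this in reverse: restore the zero baseline (or clearly annotate a non-zero one) so the drawn ratio matches the real one.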
Module 5: Leveraging AI for Research and Writing
- Using AI for Trend Analysis and Monitoring: (a) Tools: CrowdTangle, BuzzSumo, and Google Trends, (b) Workflow: Identifying emerging stories and audience engagement patterns, and (c) Exercise: Generate story leads using trend analysis tools.
Critical Component: Avoiding the “bandwagon effect”: critically analysing why certain trends gain traction. Recognising platform biases: how algorithms prioritise content visibility.
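The core idea behind trend monitoring can be sketched in a few lines: flag a day as a spike when its mention count far exceeds the recent average. The counts below are hypothetical, and platform tools like Google Trends apply far more sophisticated models; this is only to demystify what "trending" means before students critique it.

```python
# Sketch of trend-spike detection on daily mention counts: flag a day when
# its count is at least `factor` times the trailing-window mean.
def find_spikes(counts, window=3, factor=2.0):
    """Return indices of days whose count spikes above the trailing mean."""
    spikes = []
    for i in range(window, len(counts)):
        trailing_mean = sum(counts[i - window:i]) / window
        if trailing_mean > 0 and counts[i] >= factor * trailing_mean:
            spikes.append(i)
    return spikes

daily_mentions = [10, 12, 11, 13, 55, 60, 14]  # hypothetical counts
print(find_spikes(daily_mentions))  # → [4, 5]: days 4 and 5 are flagged
```

Seeing the mechanism also grounds the "bandwagon effect" discussion: a spike tells you attention rose, not why, and not whether the underlying story is accurate or newsworthy.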
- Effective Storytelling with AI Assistance: (a) Tools: Jasper AI, Writesonic, and Grammarly, (b) Balancing AI suggestions with journalistic judgment and voice, and (c) Exercise: Collaboratively write an article with AI tools.
Critical Component: (a) Mitigating overreliance on AI-generated suggestions: retaining human oversight, and (b) Self-assessment: How AI recommendations align or clash with personal biases.
- Integrating AI for Social Media Content Creation: (a) Tools: Lumen5, VEED.io, and Descript, (b) Crafting multimedia posts that prioritise accuracy and clarity, and (c) Workshop: Create a 60-second video story for social media.
Critical Component: Ethical considerations: avoiding sensationalism for engagement. Reflection: balancing personal branding with professional objectivity.
- Using AI for Audience Engagement and Feedback: (a) Tools for analytics: Hootsuite, Buffer, and Sprinklr, (b) Strategies for responding to audience feedback constructively, and (c) Exercise: Analyse audience metrics to refine storytelling approaches.
Critical Component: Recognising implicit biases in interpreting audience data. Ethical questions: How much influence should audience preferences have on editorial decisions?
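One concrete bias in interpreting audience data is comparing raw counts across posts with very different reach. A minimal sketch, with hypothetical field names rather than any platform's real API schema, shows why rates are the fairer comparison:

```python
# Minimal sketch: converting raw post metrics into engagement rates, so
# comparisons do not simply favour posts that were shown to more people.
# Field names are hypothetical, not any analytics platform's API.
def engagement_rate(post):
    """(likes + shares + comments) per 100 impressions."""
    interactions = post["likes"] + post["shares"] + post["comments"]
    return 100.0 * interactions / post["impressions"] if post["impressions"] else 0.0

posts = [
    {"id": "explainer", "likes": 120, "shares": 40, "comments": 40, "impressions": 4000},
    {"id": "breaking", "likes": 900, "shares": 50, "comments": 50, "impressions": 50000},
]

ranked = sorted(posts, key=engagement_rate, reverse=True)
print([p["id"] for p in ranked])  # → ['explainer', 'breaking']
```

Here the breaking-news post has far more total interactions, yet the explainer engages a larger share of the people who actually saw it; which number should drive editorial decisions is precisely the ethical question above.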
- Safeguarding Against Harmful Content: (a) Tools for filtering harmful content: Hive Moderation and WebPurify, (b) Case studies: Lessons from organisations that failed to mitigate harm, and (c) Workshop: Develop content moderation guidelines for a newsroom.
Critical Component: Questioning where to draw the line: balancing free speech and harm prevention. Addressing subconscious biases when categorising “harmful” content.
- The Future of Journalism in the AI Era: (a) Trends in AI and journalism: personalisation, automation, and ethics, (b) Skills for staying competitive in a changing landscape, and (c) Discussion: Debate on the journalist’s role in an AI-driven future.
Critical Component: Building epistemic resilience: continuously updating skills and perspectives. Fostering collaborative accountability: newsroom practices to mitigate individual biases.
Key Features of the Course
Assessments: Reflective writing, group debates, and content creation projects.
Learning Methods: Interactive workshops, case study analysis, AI tool demos.
Final Project: A comprehensive multimedia news report demonstrating critical literacy and ethical practices.