Week 8 Learning Journal Post

Part 1: Peer Video Project Evaluations

Group 4 - Quantum Computing: It’s Not Just Sci-Fi Anymore

1. Topic Coverage:
The team provided a clear and thorough explanation of quantum computing, including foundational concepts such as qubits, superposition, entanglement, and quantum gates. The video also highlighted current developments in the field and future implications.

2. Presentation Clarity:
The narration was well-paced and easy to follow. Transitions between sections were smooth and logical. A few technical terms could have benefited from additional visual support.

3. Research Quality:
High. The video referenced real-world examples such as IBM's Qiskit, Shor's algorithm, and Google's quantum supremacy experiment, showing strong evidence of research.

4. Production Quality:
The visuals and music were well-chosen, but the video would have been more impactful with the addition of labeled diagrams or animations for complex topics.

5. Engagement:
The introduction and conclusion helped frame the topic’s relevance. The content stayed engaging throughout by connecting abstract ideas to real-world applications.

6. Team Collaboration:
Clearly evident. Each section flowed well, indicating coordinated scripting and production.

7. Audience Fit:
Well-suited for a CS or tech-savvy audience. Concepts were explained with enough context for understanding without oversimplifying.

Suggestions:
Consider adding more graphical comparisons between classical and quantum computers to enhance visual comprehension.

Group 7 - AI Agents: Digital Workers

1. Topic Coverage:
The team provided a well-rounded overview of AI agents, covering how they function, the workflows they follow, and their applications across industries.

2. Presentation Clarity:
Very strong. Definitions and processes were explained clearly and supported by examples. The structure made it easy to follow the content.

3. Research Quality:
Excellent. The team referenced platforms like AutoGPT, GitHub Copilot, and real-world applications in customer service, healthcare, and emergency response.

4. Production Quality:
The visuals were consistent, though somewhat minimal. The content would benefit from more animation or interaction to support deeper engagement.

5. Engagement:
Effective use of industry examples helped maintain interest. The AI agent lifecycle explanation added depth to the presentation.

6. Team Collaboration:
The project felt cohesive and well-coordinated. Roles and sections were balanced, and the overall pacing was smooth.

7. Audience Fit:
Appropriate for a professional or academic audience with some familiarity with computer science. The terminology and examples were on target.

Suggestions:
Adding brief visuals or animations of an AI agent in action would make abstract concepts more tangible.

Group 8 - Drones: Friend or Foe?

1. Topic Coverage:
The video explored drones from both historical and modern perspectives. It highlighted both benefits (e.g., agriculture, disaster relief, deliveries) and risks (privacy, safety, and military use).

2. Presentation Clarity:
Generally clear. Some technical portions, such as the explanation of YOLOv3, could be simplified for broader understanding.

3. Research Quality:
Strong. The video integrated historical references, modern use cases, and real-world incidents effectively. More source citations would further strengthen credibility.

4. Production Quality:
Good, though visuals were sometimes static. The video would benefit from more motion graphics or footage to support narrative pacing.

5. Engagement:
Well-structured and engaging. Including current drone-related incidents added relevance and urgency to the topic.

6. Team Collaboration:
Teamwork was evident in the organized delivery and distribution of content. Each section contributed to the overall balance of the presentation.

7. Audience Fit:
Some sections were technical, while others were well-suited for a general audience. Slight simplification of AI models and drone delivery processes would improve accessibility.

Suggestions:
Consider adding simple visuals or animations to explain how object detection and drone navigation work.

Part 2: Final Reflection - Video Project and Course Takeaways

Our final project was titled “Digital Immortality: Can We Live Forever Through Technology?” We explored how technology is being developed to digitally preserve memories, voices, and personalities, and what ethical implications this raises.

Our team primarily communicated via Discord, using voice channels and chat to coordinate. We used Filmora and Adobe Premiere Pro for video editing, Canva for design elements, and Artlist.io for audio. Each member contributed research, scriptwriting, and voiceover work, which made the process collaborative and efficient.

One area for improvement would be setting earlier internal deadlines. Some parts of the project came together late, which limited our time for final polishing. In future collaborations, I would suggest creating a shared timeline with checkpoints to keep the team on track.

What I Learned in CST 300

This course helped me improve my ability to present technical topics in a clear, audience-focused way. I’ve also developed better collaboration skills and learned to balance creativity with structure during team projects. Additionally, the ethics essay and weekly reflections pushed me to think critically about the broader implications of technology.

Overall, this course was a valuable experience that strengthened both my communication skills and my understanding of the CS program and its expectations.
