Project Milestone 3: Final Documentation & Analysis
| Weight: 15% | Word Limit: 1000–2000 words |
| Location: ISTB12 212 | Date: May 8 | Time: 2:30–4:20 PM |
Each Milestone consists of two parts: Documentation and Technology. Milestone 3 serves as your "Scientific Dossier," focusing on validation, performance benchmarking, and ethical evaluation. The allocated final-exam time will be used for the Peer Evaluation of Milestone 3.
1. Documentation
The final report must move beyond a simple description of "how it works" to a rigorous evaluation of "how well it works". It requires empirical data and critical analysis of your system's performance, even if the final demonstration was unsuccessful.
Required Sections
- Graphical Abstract: A concise, pictorial summary of your project mission, key algorithm, and results. This should appeal to an interdisciplinary audience and will be used for the course gallery (see [1], [2]).
Note: You may use AI-generated images; however, ensure all text is legible or manually overlaid to guarantee clarity.
- Algorithm: The core technical contribution of your custom module(s). Provide formal logic using LaTeX (e.g., state-space models, prediction/update equations, etc.) (see [3]).
- Benchmarking & Results: Empirical evidence derived from the "Final Mission" execution. It may include:
- Accuracy: Time-series or spatial plots comparing Ground Truth vs. Estimated Pose.
- Error Analysis: Quantitative data on Cross-Track Error (CTE) or Covariance Ellipse growth.
- Success Rate: Reliability data across at least 10 trials.
- Ethical Impact Statement: A critical analysis (~300 words) covering Privacy (data handling/blurring), Safety (kinetic energy management), and Bias (hardware limitations, e.g., LiDAR performance on glass). Apply the Utilitarian, Justice, and related ethical frameworks.
Note: If a category doesn't apply now, consider future iterations. Engineering for society requires anticipating these impacts.
- Custom Module Code Links: Every custom module mentioned must include direct hyperlinks to the specific file and/or line number in your GitHub repository.
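As an illustration of the kind of formal logic the Algorithm section expects, here is a generic EKF prediction/update sketch in LaTeX. This is a textbook template, not your specific model; substitute your own state, motion model \(f\), and measurement model \(h\):

```latex
% Prediction step
\hat{x}_{k|k-1} = f(\hat{x}_{k-1|k-1}, u_k), \qquad
P_{k|k-1} = F_k P_{k-1|k-1} F_k^\top + Q_k

% Update step
K_k = P_{k|k-1} H_k^\top \left( H_k P_{k|k-1} H_k^\top + R_k \right)^{-1}

\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \left( z_k - h(\hat{x}_{k|k-1}) \right), \qquad
P_{k|k} = \left( I - K_k H_k \right) P_{k|k-1}
```

Here \(F_k\) and \(H_k\) are the Jacobians of \(f\) and \(h\), and \(Q_k\), \(R_k\) are the process and measurement noise covariances.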
Individual Contribution & Audit Appendix

| Team Member | Primary Technical Role | Key Git Commits/PRs | Specific File(s) Authorship |
|---|---|---|---|
| Student Name 1 | Perception & Filtering | Commit a1b2c3d | custom_ekf_node.py |
| Student Name 2 | Path Planning | Commit e5f6g7h | rrt_star_planner.cpp |
| Student Name 3 | Integration & Hardware | Commit i9j0k1l | robot_launch.py |

Note on Audits: Points for the "Algorithmic Factor" will be forfeited if git logs reveal the claimant is not the actual author of the custom logic.
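For the Benchmarking & Results section above, a minimal sketch of how error and success-rate metrics might be computed from logged trial data. All arrays, names, and thresholds here are hypothetical placeholders, not required interfaces:

```python
import numpy as np

def pose_rmse(ground_truth: np.ndarray, estimated: np.ndarray) -> float:
    """RMSE between ground-truth and estimated 2-D positions, shape (N, 2).
    A full cross-track error would project onto the reference path;
    plain Euclidean error is shown here for brevity."""
    err = np.linalg.norm(ground_truth - estimated, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))

def success_rate(trial_outcomes: list) -> float:
    """Fraction of successful trials (e.g., goal reached within tolerance)."""
    return sum(trial_outcomes) / len(trial_outcomes)

# Synthetic example: 10 trials, 8 successes
outcomes = [True] * 8 + [False] * 2
print(success_rate(outcomes))  # 0.8

gt = np.array([[0.0, 0.0], [1.0, 0.0]])
est = np.array([[0.0, 0.1], [1.0, -0.1]])
print(round(pose_rmse(gt, est), 3))  # 0.1
```

Plotting these per-trial values over time (e.g., with matplotlib) yields the Ground Truth vs. Estimated Pose comparisons requested above.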
2. Peer Review Audit & Grading
Teams will be split into Presenters (stationary) and Reviewers (shuffled into 3-person squads from different teams).
2.1 Presentation Format
Each presentation is 15 minutes long. Every team member must present a portion of the work. You are not required to create a separate slide deck; instead, you will present directly from your Documentation (Milestones 1–3) and code repository.
Content Requirements:
- Aim & Motivation: Why was this project chosen and what problem does it solve?
- System Design: Detailed walk-through of your ROS 2 graph and hardware/simulation setup.
- Custom Logic: A deep-dive into your algorithm/custom logic and where it lives in the source code.
- Results & Analysis: Live playback of your demo video with a walk-through of your benchmarking plots.
- Ethical Statement: A defense of your ethical impact analysis.
2.2 The Audit Flow (90 Mins)
| Phase | Duration | Teams 1-5 | Teams 6-10 |
|---|---|---|---|
| Setup 1 | 10 mins | Set up reports and video demos | Shuffling into Review Squads |
| Round 1 | 30 mins | Presenting to Review Squads | Auditing Stations 1-5 |
| Finalize 1 | 5 mins | Collect feedback | Submit scores for Round 1 |
| Setup 2 | 10 mins | Prepare to move / Re-shuffle | Set up reports and video demos |
| Round 2 | 30 mins | Auditing Stations 6-10 | Presenting to Review Squads |
| Finalize 2 | 5 mins | Submit scores for Round 2 | Collect feedback |
2.3 Peer Rubric
| Criteria | Pts | Auditor Focus |
|---|---|---|
| System Design | 2 | Clarity of Graphical Abstract and ROS rqt_graph/system design graph. |
| Logic/Algorithm Trace | 3 | Custom logic and mapping of logic directly to specific code lines. |
| Demo Success | 2 | Robustness and Completion Status of work captured in a video. |
| Results and Analysis | 3 | Quality of benchmarking data and/or technical challenge depth. |
| Ethical Defense | 2 | Understanding of safety risks and bias in future iterations. |
| Code Review | 3 | Readability of custom nodes, modularity, and proper use of ROS 2 interfaces. |
| Total | 15 | Average peer score |
3. Individual Contribution Scenarios
Weighting: 70% Technology (10.5 pts), 30% Documentation (4.5 pts)
In Milestone 3, the 70/30 Tech Split is introduced to differentiate "Using ROS" vs. "Engineering for ROS."
| Team Profile (Base Points) | Student Profile (Multiplier) | Doc (/30) | Tech Base (/70) | Indiv. Factor | Total Pts (/15) |
|---|---|---|---|---|---|
| Gold Standard | Lead: Authored custom logic. | 30 | 70 | 1.0 | 15.0 |
| Gold Standard | Integrator: Owned Infra (Launch/TF). | 30 | 70 | 1.0 | 15.0 |
| High Perf. | Passenger: No Git presence. | 30 | 70 | 0.0 | 4.5 |
| Broken Integ. | Documenter: Wrote documentation only. | 30 | 40 | 0.75 | 7.9 |
| Hardware Team | Lead: Physics/System Testing. | 30 | 70 | 1.0 | 15.0 |
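One way to read how these components might combine is sketched below. The formula is an assumption for illustration (the individual factor is assumed to scale only the Technology component, with Documentation treated as team-based); it is not the official grading code, and the actual rubric may apply the multiplier differently in some scenarios:

```python
def milestone3_score(doc_base: float, tech_base: float,
                     indiv_factor: float,
                     milestone_weight: float = 15.0) -> float:
    """Hypothetical score calculation: doc_base and tech_base are the
    team's base points out of 100 total; the individual factor is
    assumed to scale only the Technology component."""
    pct = doc_base + tech_base * indiv_factor  # out of 100
    return round(pct / 100.0 * milestone_weight, 1)

# Under this assumption:
print(milestone3_score(30, 70, 1.0))  # 15.0 (Gold Standard lead)
print(milestone3_score(30, 70, 0.0))  # 4.5  (Passenger: doc only)
```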