Once data preparation is complete and the team is ready for model development, PMI-aligned AI lifecycle guidance calls for clear definition and documentation of performance metrics and success criteria before training models. The project manager should ensure that everyone agrees on which metrics will be used (e.g., accuracy, precision, recall, F1, AUC, business KPIs) and what thresholds will be considered acceptable. This supports traceability, objective evaluation, and transparent go/no-go decisions in later stages.
Because the question states that the data is already prepared and the team is ready to proceed, it implies that initial data quality activities have already occurred. Repeating a “final assessment of data quality” (option A) is less critical at this specific point than locking in evaluation metrics. Go/no-go questions (option C) and scalability reporting (option D) depend on having those metrics explicitly defined; they are downstream decisions and artifacts. PMI-style AI guidance stresses that model development should be driven by pre-defined, documented performance metrics that connect technical outputs to business value and risk tolerances. Therefore, the next action for the project manager is to document the performance metrics for the model.
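To make the recommended practice concrete, here is a minimal sketch of documenting metrics and acceptance thresholds up front and using them for an objective go/no-go check. All names and threshold values are hypothetical illustrations, not taken from PMI guidance:

```python
# Sketch: pre-defined, documented performance metrics drive the
# go/no-go evaluation. Threshold values below are hypothetical.

def evaluate(tp, fp, fn, tn, thresholds):
    """Compute standard classification metrics from confusion-matrix
    counts and check them against documented acceptance thresholds."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    metrics = {"accuracy": accuracy, "precision": precision,
               "recall": recall, "f1": f1}
    # Go decision: every documented threshold must be met.
    go = all(metrics[name] >= floor for name, floor in thresholds.items())
    return metrics, go

# Documented before training begins (hypothetical values):
THRESHOLDS = {"precision": 0.80, "recall": 0.75, "f1": 0.77}

metrics, go = evaluate(tp=80, fp=10, fn=20, tn=90, thresholds=THRESHOLDS)
```

Because the thresholds are agreed and written down before model development, the later go/no-go decision reduces to a transparent comparison rather than a subjective judgment.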