A. Hornof - 4-24-2018
Final projects should be delivered by having one team member deposit a single .zip file of all project documents and files into Canvas before the date and time that the project is due.
Unfortunately, some projects receive lower grades than they should simply because it is not clear where to find some of the team's work products, work products are missing, or it is not clear exactly how to run and use the application. Please make sure that multiple team members review the final .zip file that will be submitted well before the due date, and again before delivery. The following checklist is not complete but may help avoid some of the more obvious errors:
A total of 100 points are possible on each project, allocated as described in this document. Note that a single problem (e.g., inadequate documentation, or a program bug) can cost points in more than one category. For example, if the documentation is not adequate for using a feature, you won't get credit for implementing that feature, nor will you get credit for documenting a feature that doesn't actually work (you may instead lose points for inaccurate documentation). Likewise, if a feature is implemented but cannot be used because the program crashes, you won't get credit for implementing that feature.
This initial deliverable for the project will be evaluated by assigning 0 to 1 point to each of the following fifteen bullets based on how well each was completed.
Throughout all sections: Good Writing (from the syllabus). Structure the paper so that the main ideas are clearly accessible. Communicate individual ideas effectively. All spelling and grammar must be standard and correct.
Application quality includes features used by all categories of users. Often this includes administrative users (sysadmins, etc.) in addition to end users. Functionality also includes performance and scalability.
This evaluates the extent to which the system can be run with no bugs or errors, such as failing to install or compile, behaving differently than specified, or crashing.
This evaluates the extent to which the system exceeded, met, or fell short of requirements. This considers requirements that were assigned and specified by the team. This evaluates whether essential features were missing and whether extended features were provided.
This category includes setup (if any) as well as usability for end users and for administrators. It is typically distinct from quality of documentation, but may include the presence and usefulness of online help. Usability considerations for administrative users may be quite different from usability considerations for end users; both are considered here. For a prototype, this addresses both existing and planned features.
This evaluates the extent to which documentation clearly and accurately describes, for an end user, how to accomplish real-world tasks using the software. This includes how to install and set up the system. The documentation should be clear, accurate, well-written, well-organized, and complete.
Technical documentation and system organization are evaluated together. Good documentation can make a good design evident, and poor documentation can doom a good design to degradation over time, but good documentation cannot compensate for poor design.
This section is evaluated on how effectively the team performed project planning, how well it followed the plan, how well the team adapted the plan to inevitable hiccups, and whether risks were effectively considered, mitigated, and re-evaluated when plans changed.
For full points, a report should clearly indicate who did what, when they did it, and how long everyone spent on each of their tasks (assigned date and completed date, as well as time on task). There should be a clear record of when meetings were held, and what was accomplished and agreed upon at each meeting. A series of continually updated project plans should show the status of major project milestones and deliverables at regular intervals during the project lifecycle.
This should include all elements described in the initial SRS submission, updated to reflect what was actually built. This should include a clear, complete, and well-organized description of requirements, including a rationale for what is included, what has been deferred to the future, and (as appropriate) what has been excluded. The document should provide a clear user-centric specification as well as a well-defined, precise technical specification. Future developers should be able to use the document for system creation or maintenance.
The software design is communicated as follows:
This section should provide an easily understandable specification of the system architecture. Key architectural design decisions should be communicated, along with the rationale for each. The design should satisfy stated design goals.
This evaluates the technical documentation; that is, the human-readable text that a programmer or human-installer would use to understand (a) how to install the software, (b) software dependencies, (c) required versions of components, (d) how source code files relate to each other, (e) the purpose and inner workings of each source code file, function, if-then statement, and loop, (f) each line of source code, and (g) the purpose, scope, and lifetime of each variable. This includes documents that are separate from the code as well as comments within the code.
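The kinds of in-code comments described above might look something like the following minimal Python sketch. The file, function, and variable names here are invented purely for illustration; they are not part of any assigned project.

```python
# convert_temps.py -- Illustrative (hypothetical) example of in-code documentation.
# Purpose: convert a list of Celsius readings to Fahrenheit.
# Dependencies: Python 3 standard library only; no installation required.

def celsius_to_fahrenheit(readings):
    """Convert each Celsius value in 'readings' to Fahrenheit.

    readings: a list of numbers (scope: this function; lifetime: one call).
    Returns a new list; the input list is not modified.
    """
    converted = []                        # accumulates results; lives until return
    for c in readings:                    # loop: one iteration per input reading
        converted.append(c * 9 / 5 + 32)  # standard conversion formula
    return converted

print(celsius_to_fahrenheit([0, 100]))  # prints [32.0, 212.0]
```

Note how the header comment covers installation and dependencies, the docstring covers purpose and behavior, and each variable and loop carries a comment explaining its role and lifetime.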
Created by A. Hornof, 4/24/18. Adapted from materials created by
Stuart Faulk and Michal Young.
Removed from Dr. Faulk's evaluation criteria: QA Planning, Developer Logs, and Overall SE and Project Control.