[IPOL discuss] 1028-1997, IEEE Standard for Software Reviews
Nicolas Limare
nicolas.limare at cmla.ens-cachan.fr
Fri Feb 17 11:13:59 CET 2012
Hi,
> I just noticed there is an IEEE Standard for Software Reviews [1].
> [1]: http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=5362
> (Google "1028-1997, IEEE Standard for Software Reviews" for PDF)
I didn't know of this standard. I found a revised version, IEEE Std
1028-2008; searching Google for "1028-2008 BASKENT" turns up a
good-quality PDF.
I read it quickly (I quote some passages at the end of this
message). It describes review teams, procedures, meetings,
documentation and reports, but not how reviewers should actually
perform their task. So, as a whole, I think this standard is not well
suited to the peer review of software in a research journal.
For the moment, we have no "Reviewer Handbook". Such a document would
be helpful because reviewers are not used to this kind of task, and
some of them do not really know what IPOL expects or how to conduct
their review. The "Software Guidelines" can help them, but they are
not sufficient.
I suggest that, after a few reviews, veteran IPOL reviewers write a
short "Reviewer Handbook" to guide new ones. It could include a
template for the report, with checklists and so on. This handbook
would be offered to reviewers as guidance, not made mandatory. Daniel,
you are welcome to propose a first draft :)
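For illustration only, here is a hypothetical outline of such a
report (a strawman to react to, not an official template):

  Review report (strawman)
  1. Build & run: does the code compile and run as documented?
  2. Fidelity: does it implement the published algorithm, step by step?
  3. Guidelines: does it follow the IPOL "Software Guidelines"
     (license, portability, readability)?
  4. Anomalies found, each with a severity
     (catastrophic / critical / marginal / negligible)
  5. Recommendation: accept / accept after rework / reject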
8<----------8<----------8<----------8<----------8<----------8<----------
I found these interesting passages in the standard:
* It defines 5 categories of reviews: management reviews, technical
reviews, inspections, walk-throughs, audits. IPOL reviews may belong
to the audit category:
«An independent examination of a software product [...] to assess
compliance with specifications [and resulting in] a clear
indication of whether the audit criteria have been met.»
Technical reviews, inspections and walk-throughs also match to some
extent, but IPOL reviews are clearly not management reviews.
* The typical inspection rate for source code is between 100 and 200
lines of code per hour; we have to keep that in mind when we ask
reviewers to examine large codes (see the back-of-the-envelope
estimate after this list).
* Software anomalies in technical reviews can be ranked as
catastrophic, critical, marginal or negligible.
* The output of an inspection can be one of three dispositions (a
toy encoding is sketched after this list):
a) Accept with no or minor rework. The software product is accepted
as is or with only minor rework (for example, rework that requires
no further verification).
b) Accept with rework verification. The software product is to be
accepted after the inspection leader or a designated member of
the inspection team (other than the author) verifies rework.
c) Reinspect. The software product cannot be accepted. Once
anomalies have been resolved a reinspection should be scheduled
to verify rework. At a minimum, a reinspection shall examine the
software product areas changed to resolve anomalies identified in
the last inspection, as well as side effects of those changes.
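As a back-of-the-envelope illustration of the inspection rate quoted
above (the 100-200 lines/hour figure is from the standard; the helper
below is only a hypothetical sketch, not an IPOL tool):

# Hypothetical sketch: estimate reviewer effort from the code size,
# using the 100-200 lines/hour inspection rate cited in the standard.
def review_hours(lines_of_code, rate_low=100, rate_high=200):
    """Return the (pessimistic, optimistic) review time in hours."""
    return lines_of_code / rate_low, lines_of_code / rate_high

# A 5000-line submission needs roughly 25 to 50 reviewer-hours:
slow, fast = review_hours(5000)
print("between %.0f and %.0f hours" % (fast, slow))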
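And a purely hypothetical sketch of how a review report could encode
the anomaly ranking and the three dispositions; the category names
come from the standard, but the decision rule in disposition() is my
own assumption:

# Hypothetical sketch: the severity and disposition names come from
# the standard; the mapping rule below is an assumption, not part of
# the standard or of any IPOL procedure.
from enum import Enum

class Severity(Enum):
    CATASTROPHIC = 4
    CRITICAL = 3
    MARGINAL = 2
    NEGLIGIBLE = 1

class Disposition(Enum):
    ACCEPT = "accept with no or minor rework"
    ACCEPT_AFTER_VERIFICATION = "accept with rework verification"
    REINSPECT = "reinspect"

def disposition(anomalies):
    """Map the severities found by a review to a disposition."""
    if any(a.value >= Severity.CRITICAL.value for a in anomalies):
        return Disposition.REINSPECT
    if any(a is Severity.MARGINAL for a in anomalies):
        return Disposition.ACCEPT_AFTER_VERIFICATION
    return Disposition.ACCEPT

print(disposition([Severity.NEGLIGIBLE, Severity.MARGINAL]).value)
# -> accept with rework verification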
--
Nicolas LIMARE - CMLA - ENS Cachan http://www.cmla.ens-cachan.fr/~limare/
IPOL - image processing on line http://www.ipol.im/