[IPOL discuss] Handbook for Reviewers: your feedback required! :)

Daniel Kondermann daniel.kondermann at iwr.uni-heidelberg.de
Fri Feb 17 13:07:23 CET 2012


Hi!

My main problem during review is currently to understand the aim of
the article, e.g. which audience should be able to understand it.

From my point of view it would be great if the audience were Master
students. Therefore, the authors need to either explain all
theoretical derivations as in a tutorial, or cite documents which do
this job. For theoretically involved topics such as graphical models,
textbook pointers should be a minimum requirement.

My next point is that, in my opinion, each implementation choice needs
to be thoroughly motivated and/or discussed. Assume you want to
interpolate image pixels: why use linear, bicubic, spline or sinc
interpolation in this specific case? "Our experiments showed..." just
means "we have no clue and we actually don't care (here)!". One answer
might simply be: the authors of the original paper chose it this way,
but they did not explain why. This could make up another minimum
requirement which is special to IPOL, and it would help to identify
unexamined choices in existing papers.
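
To make this concrete, here is a small hypothetical sketch in C (the
usual language of IPOL codes); the function and its stated rationale
are invented for this example, but it shows the kind of documented
choice I would like to see:

    #include <stdio.h>
    #include <math.h>

    /* Bilinear interpolation of a grayscale image at (x, y).
     * Motivation (the kind of statement the article should make):
     * bilinear is continuous and cheap, and the extra smoothness of
     * bicubic would be lost anyway, because this value is only used
     * to initialize a coarse pyramid level. */
    static float interp_bilinear(const float *img, int w, int h,
                                 float x, float y)
    {
        int x0 = (int) floorf(x), y0 = (int) floorf(y);
        float fx = x - x0, fy = y - y0;
        /* clamp the four sample positions to the image border */
        int xa = x0 < 0 ? 0 : (x0 >= w ? w - 1 : x0);
        int ya = y0 < 0 ? 0 : (y0 >= h ? h - 1 : y0);
        int xb = xa + 1 >= w ? w - 1 : xa + 1;
        int yb = ya + 1 >= h ? h - 1 : ya + 1;
        return (1 - fy) * ((1 - fx) * img[ya * w + xa]
                           + fx * img[ya * w + xb])
             + fy * ((1 - fx) * img[yb * w + xa]
                     + fx * img[yb * w + xb]);
    }

    int main(void)
    {
        float img[4] = {0.f, 1.f, 2.f, 3.f};  /* 2x2 test image */
        /* interpolate at the center; expected value: 1.5 */
        printf("%g\n", interp_bilinear(img, 2, 2, 0.5f, 0.5f));
        return 0;
    }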

Finally, it would be great to carefully list the assumptions an
algorithm makes. Usually, this can be done statistically, by stating
prior distributions and independence assumptions. This is a difficult
task, as most publications make their assumptions implicitly,
sometimes even without being aware of them, especially when they are
not formulated in a statistical framework. A great example of a clear
motivation with assumptions in a somewhat heuristic paper is:
http://gfx.cs.princeton.edu/pubs/Barnes_2009_PAR/
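
As a generic illustration (a toy example of my own, not taken from any
IPOL submission): for a denoising method, simply writing down the
model already makes the assumptions explicit, e.g. in LaTeX:

    % observation model: v = u + n, n i.i.d. Gaussian per pixel
    % prior: p(u) \propto \exp(-\lambda \, \mathrm{TV}(u)),
    %        i.e. we assume approximately piecewise-constant images
    \hat{u} = \arg\min_u \frac{1}{2\sigma^2} \|v - u\|_2^2
              + \lambda \, \mathrm{TV}(u)

Here the i.i.d. Gaussian likelihood and the TV prior are exactly the
kind of "prior distributions and independence assumptions" I mean, and
a reviewer can check whether the paper states them.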


I think the first step to create such a Handbook for Reviewers is to
loosely collect all thoughts on this mailing list. I volunteer to
moderate the discussion and then organize the information into a
keyword list and a rough document structure. Finally, we can jointly
create the actual document.
As the ECCV deadline is drawing near, I have to take care not to do
too much, but I would suggest that we discuss this topic now (any
reviewers with their experiences and thoughts here?) and aim to have
a first draft ready in about a month or so.

Cheers,
Daniel

On 17.02.2012 11:13, Nicolas Limare wrote:
> Hi,
> 
>> I just noticed there is an IEEE Standard for Software Reviews [1].
>> [1]: http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=5362
>> (Google "1028-1997, IEEE Standard for Software Reviews" for PDF)
> 
> I didn't know of this standard. I found a revised version, named
> IEEE Standard 1028-2008. Look for "1028-2008 BASKENT" in Google for a
> good quality PDF.
> 
> I quickly read it (I quote some passages at the end of this
> message). It describes review teams, procedures, meetings,
> documentation and reports, but not how reviewers should perform
> their task. So, as a whole, I think this standard is not suited to
> the peer reviews performed on software in a research journal.
> 
> For the moment, we have no "Reviewer Handbook". Such a document
> would be helpful because reviewers are not used to this kind of
> task, and some of them do not really know what IPOL expects and how
> to conduct their review. The "Software Guidelines" can help them,
> but they are not sufficient.
> 
> I suggest that after a few reviews, veteran IPOL reviewers write a
> short "Reviewer Handbook" to guide the new ones. This could include a
> template for the report, with checklists and so on. This handbook
> would be proposed to the reviewers, but not mandatory. Daniel, you are
> welcome to propose a first draft :)
> 
> 8<----------8<----------8<----------8<----------8<----------8<----------
> 
> I found these interesting passages in the standard:
> 
> * It defines 5 categories of reviews: management reviews, technical
>   reviews, inspections, walk-throughs, audits. IPOL reviews may belong
>   to the audit category:
>       «An independent examination of a software product [...] to assess
>       compliance with specifications [and resulting in] a clear
>       indication of whether the audit criteria have been met.»
>   The technical reviews, inspections and walk-throughs also match to
>   some extent, but IPOL reviews are clearly not management reviews.
> 
> * Typical inspection rate is between 100 and 200 lines of code per
>   hour for source code reviews; we have to keep that in mind when we
>   ask for large codes to be reviewed.
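>   For example, at that rate a 3000-line program already represents
>   15 to 30 hours of inspection.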
> 
> * Software anomalies in technical reviews can be ranked as
>   catastrophic, critical, marginal or negligible.
> 
> * The output of a technical review can be
>   a) Accept with no verification or with rework verification. The
>      software product is accepted as is or with only minor rework (for
>      example, that would require no further verification).
>   b) Accept with rework verification. The software product is to be
>      accepted after the inspection leader or a designated member of
>      the inspection team (other than the author) verifies rework.
>   c) Reinspect. The software product cannot be accepted. Once
>      anomalies have been resolved a reinspection should be scheduled
>      to verify rework. At a minimum, a reinspection shall examine the
>      software product areas changed to resolve anomalies identified in
>      the last inspection, as well as side effects of those changes.

