[IPOL discuss] Suggested changes in Submission Procedure/Author Manual

Pascal Monasse monasse at imagine.enpc.fr
Tue Jul 4 16:55:21 CEST 2017


Dear all,

I agree that the manuals/guidelines/web pages need a serious update, but I would 
not advise removing all the restrictions unilaterally. I reply point by point 
below.

> It's not true any more that we only accept C/C++ code, and there's no
> reason why the algorithms can't go further than 1 Gb of RAM usage,
> since we have a lot of memory in the servers. They should not use an
> exaggerated amount of memory, though.

OK for C/C++, we have been accepting other languages for a long time.
Concerning the RAM usage, the servers can indeed support much more than 1 GB, 
but that is not the only criterion. We would like the code to be able to run 
on regular machines, not only big servers. We could discuss whether 1 GB is too 
low a target for current machines, but I think that setting such a target is a 
good thing. It need not be a hard requirement, just a recommendation that can 
be overridden if the editor agrees.

> - We accept not only C/C++ but also Python and MATLAB and we're open
> to other languages eventually.

Indeed, we need to update the docs.

> - All those technical terms give the impression that we're going to
> reject anything which does not follow exactly the norm of the compile
> (C89, C99, C++98), which is not true.

Still, having standard-conforming code for C and C++ seems important to me for 
portability. Naturally, as Python and MATLAB have no such strict standards, 
this does not apply to them.

> - We give a list of allowed libraries, but these technical details
> should be discussed with the editor, not put here as a strong
> requirement. The editor should tell the author what is possible and
> what is not, instead of writing "only libtiff, libjpeg, libpng,
> libgsl, eigen, zlib, fftw, cblas and clapack external libraries".

Again, for portability, it is important to delimit the dependencies that are 
allowed. The guidelines say that other libraries must be included in the 
package, so I think the list is not too restrictive.

> - We say "need at most 1 GB memory and 30 s computation". Again, this
> is flexible. It seems that if your algorithm takes 45 seconds it's
> going to be rejected. And that's not true at all. It should be a
> discussion with the editor instead of a hard limit.

That could be open to discussion with the editor, but again the 30 s computation 
time should be a strongly recommended target. This may oblige authors to 
restrict the input sizes of their demo considerably, but if a demo takes more 
than 30 s, people will not have the patience to try it.

> - "read/write PNG, TIFF, PNM, EPS, SVG, VRML or PLY format." --> Same
> comment here. Actually, the system can perform many conversions if
> needed.

Yes it can, but do we want to multiply the formats? This list should cover most 
needs; it can be expanded based on authors' requirements in agreement with the 
editors, but sticking to widely used formats is generally a good thing.

> We should remove lines such as:
> - C89, C99 or C++98 code tested with gcc -std=xxx -Wall -Wextra -Werror

Why? It is a good requirement for code quality, and not overly difficult to 
comply with.

> We should remove lines such as:
> - compilation with make or cmake, only standard options, make uses
> $(CC) or $(CXX)

Why arbitrarily multiply the build methods? At least for C and C++ code, we 
want to stick to the dominant build tools.
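For reference, the guideline asks for no more than something like the following sketch (file names hypothetical): using $(CC) instead of a hard-coded compiler lets the demo system, or a reviewer, substitute any conforming compiler.

```makefile
# Minimal guideline-conforming Makefile sketch: $(CC) and standard
# flags only, no exotic options or hard-coded compiler path.
CFLAGS = -std=c99 -Wall -Wextra -Werror -O2

demo: demo.o io.o
	$(CC) $(CFLAGS) -o $@ $^

%.o: %.c
	$(CC) $(CFLAGS) -c $<

clean:
	rm -f demo *.o
```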

> We should remove lines such as:
> - max 80 characters per line, max 1000 lines per file

Again, why? We do not want messy code, and this is not too much to ask of 
authors.

> We should remove lines such as:
> -  This file archive can either be a single volume .ZIP compressed
> archive or a GZIP compressed tar archive2. The size of the compressed
> archive file should be less than 2 MB --> There's no reason to limit
> to 2 Mb!

OK for that one; after all, it is just a question of what the server can 
support.

> -  A published software must not be distributed with binary
> precompiled files if these files can be obtained from source code -->
> Actually, it should NEVER contain precompiled files!

Agreed.

> - Other techniques for accelerated computing, like OpenCL and OpenACC,
> are not supported by the journal --> We support it in the new system
> if we manage to have a server with a GPU.

I have no strong opinion on that. Of course, the server could support those.

> Again, this should be fixed, since it's blocking without any reason
> a lot of eventual submissions.

The omission of MATLAB and Python may indeed have that effect; I do not think 
that is the case for the other requirements.

> "I only work with Windows (resp. Linux), I don't think Linux (resp.  
> Windows) compatibility is important." --> This is irrelevant and  
> confusing here. We ask for the minimal requirement that the code  
> compiles and works in Debian Stable, but we won't make any comments if  
> it works/don't work in any of the thousands of other systems out there.

Having C/C++-compliant code should answer that. Yes, we want Linux 
compatibility (why just Debian Stable?). Even though Windows compatibility may 
not be required (the build can be much more complex on Windows because there 
is no standard location for dependencies), it could still be a recommendation. 
Concerning my demos, people have several times requested help installing them 
on Windows machines, so I now put some effort into making such integration 
easy. Other authors may not care about that, but I am glad that people are 
using my code.
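The dependency-location problem is largely what CMake's find_package solves; a minimal sketch (project and file names hypothetical) shows how the same build file can work on Linux and Windows without hard-coding any path:

```cmake
# Hypothetical CMakeLists.txt sketch: find_package locates libpng
# wherever the platform installed it, so the build file contains no
# hard-coded dependency paths.
cmake_minimum_required(VERSION 3.10)
project(ipol_demo C)

find_package(PNG REQUIRED)

add_executable(demo demo.c)
target_link_libraries(demo PRIVATE PNG::PNG)
```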

Best,
Pascal
