
Should package version requirements assume installation from sources?

9 messages · Mikael Jagan, Hervé Pagès, Ben Bolker +4 more

#
[Arguably also appropriate for R-package-devel, but posted to R-devel
     as the discussion is aimed primarily at "experts" ... ]

We, the authors of Matrix, have encountered a somewhat subtle issue
induced by caching of S4 classes and methods in package namespaces.

The namespaces of three reverse dependent packages (SeuratObject, conText,
mcmcsae) cache the formal definition of our virtual class Matrix (and some
subclasses).  For example:

 > ns <- asNamespace("SeuratObject")
 > grep("^[.]__C__.*Matrix$", names(ns), value = TRUE)
[1] ".__C__dMatrix"       ".__C__compMatrix"    ".__C__AnyMatrix"
[4] ".__C__generalMatrix" ".__C__CsparseMatrix" ".__C__sparseMatrix"
[7] ".__C__dsparseMatrix" ".__C__Matrix"

The cached definition (which includes a _validity method_) is obtained from
the version of Matrix available when the reverse dependent package was built
from sources.  For example, if SeuratObject was built under Matrix 1.4-1,
then we get:

 > getValidity(ns$.__C__Matrix)
function (object)
{
     if (!isTRUE(r <- .Call(Dim_validate, object, "Matrix")))
         r
     else .Call(dimNames_validate, object)
}
<bytecode: 0x11e7ca508>
<environment: namespace:Matrix>

whereas if SeuratObject was built under Matrix >= 1.5-0, then we get:

 > getValidity(ns$.__C__Matrix)
function (object)
.Call(Matrix_validate, object)
<bytecode: 0x107dc1698>
<environment: namespace:Matrix>
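The mechanism can be reproduced in miniature with a toy class (not Matrix's actual classes): a class definition captured by value keeps the validity method it had at capture time, even after the defining package replaces it.

```r
library(methods)

## Toy illustration of the caching problem (not Matrix's actual code):
## a captured class definition keeps the validity method it had at
## capture time, even after the definition is later changed upstream.
setClass("Toy", representation(x = "numeric"),
         validity = function(object) TRUE)        # "old" validity

cached <- getClassDef("Toy")  # what a dependent package caches at build time

## Upstream later replaces the validity method:
setValidity("Toy", function(object)
    if (length(object@x)) TRUE else "'x' must be non-empty")

## The cached copy is now stale:
identical(getValidity(cached), getValidity(getClassDef("Toy")))  # FALSE
```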

There are two "questions" here:

1.  The symbol 'Matrix_validate' is not defined until Matrix 1.5-0.
     Is it necessary, for this reason alone, for SeuratObject to have
     'Imports: Matrix (>= 1.5-0)'?  Or can SeuratObject continue using
     'Imports: Matrix (>= 1.3-3)', at the risk of errors like

     > Error: object 'Matrix_validate' not found

     (as already seen here: https://stackoverflow.com/questions/73700130)?

     Note that this error would not occur for anyone installing SeuratObject
     from sources, unless they decide to _downgrade_ Matrix after doing so.
     Hence this primarily concerns Windows and macOS, where R users would
     typically install a binary built by CRAN (i.e., not built on their own system).

     We are aware that package TMB tests in .onLoad() that the current Matrix
     version is equal to or greater than the version available at build time,
     thus avoiding a "strict" version requirement, but do not want this practice
     to spread ...

2.  For how long should Matrix retain the superseded 'Dim_validate' and
     'dimNames_validate', in order to ensure that "stale" cached validity
     methods continue to work?
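For reference, a load-time guard of the kind attributed to TMB might be sketched as follows (an illustration of the idea only, not TMB's actual code; the variable name is invented here):

```r
## Sketch of a load-time guard against a downgraded dependency (the
## idea attributed to TMB above; not TMB's actual code).  Top-level
## package code is evaluated when the package is installed from
## source, so this records the Matrix version present at build time:
.Matrix.version.at.build <- utils::packageVersion("Matrix")

.onLoad <- function(libname, pkgname) {
    ## At load time, refuse to run against an older Matrix than the
    ## one the package (and its cached class definitions) was built under.
    if (utils::packageVersion("Matrix") < .Matrix.version.at.build)
        stop(pkgname, " was built under Matrix ", .Matrix.version.at.build,
             " but Matrix ", utils::packageVersion("Matrix"),
             " is installed; please reinstall ", pkgname, " from sources")
}
```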

We hope that this discussion will highlight the potential ramifications
of importing classes and methods from other packages, and having one's
classes and methods imported _by_ other packages, especially for version
requirements.

Mikael, Martin, and Doug
#
FWIW this happens all the time in Bioconductor, where S4 is used very 
intensively, when we make that kind of change to a core infrastructure 
package. Then a bunch of Windows and Mac binary packages need to be 
rebuilt because they contain cached stuff that is now out-of-sync with 
the new state of affairs. The way we deal with this is by bumping the 
version numbers of all the affected packages. This triggers automatic 
rebuilds and propagation of the new binaries.

Note that these problems also affect packages that were already 
installed on the user machines _prior_ to the changes to the upstream 
package, because the caching we are talking about actually lives in the 
package installation folder (the fact that it ends up in the binaries 
is just a consequence of that). So even on a system where packages are 
installed from source, the affected packages need to be reinstalled. 
Bumping the version numbers of all the affected packages also solves that.

I don't see how this could be handled or mitigated via package requirements.

Best,

H.
On 13/09/2022 14:45, Mikael Jagan wrote:

#
Interesting.

   How is the version number bumping communicated back to package 
maintainers? Or is there a rule like "third digit-blocks of package 
versions are reserved for Bioconductor modifications"?
On 2022-09-13 7:11 p.m., Hervé Pagès wrote:

#
On 13 September 2022 at 16:11, Hervé Pagès wrote:
| FWIW this happens all the time in Bioconductor, where S4 is used very 
| intensively, when we make that kind of change to a core infrastructure 
| package. Then a bunch of Windows and Mac binary packages need to be 
| rebuilt because they contain cached stuff that is now out-of-sync with 
| the new state of affairs. The way we deal with this is by bumping the 
| version numbers of all the affected packages. This triggers automatic 
| rebuilds and propagation of the new binaries.

When you control the whole repo, and release it in batch, you can rebuild and
increment versions (more on that below).

| Note that these problems also affect packages that were already 
| installed on the user machines _prior_ to the changes to the upstream 
| package. Because the caching we are talking about is actually in the 
| package installation folder (and the fact that it ends up in the 
| binaries is just a consequence of that). So even on a system where 
| packages are installed from source, the affected packages need to be 
| reinstalled. Bumping the version numbers of all the affected packages 
| also solves that.
| 
| I don't see how this could be handled or mitigated via package requirements.

R uses a bunch of elements from the Debian package graph, namely

 Depends/Imports,
 Recommends,
 Suggests,
 Enhances (even if rarely used)

We may need to think about _the other direction_ via

 Breaks

where, in Mikael's case, we could say that Matrix has a
Breaks: relation on the current (known, built with the previous Matrix) packages
SeuratObject, conText, and mcmcsae (each with a <= constraint set to their
current CRAN version).  This would ensure that when Matrix updates, those
packages also update.
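In DESCRIPTION terms, the hypothetical field Dirk describes might look like this (the field name 'Breaks' and the placeholder versions are illustrative only; no such field exists in R's DESCRIPTION format today):

```
Package: Matrix
Version: 1.5-0
Breaks: SeuratObject (<= x.y.z), conText (<= x.y.z), mcmcsae (<= x.y.z)
```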

But it gets trickier, because Debian can differentiate between an upstream
source version, say Matrix 1.5-1, _and its build versions_, say 1.5-1-1: the
breakage would be of 1.5-1-1 but resolved by a new binary 1.5-1-2, which
sorts higher.  We don't have that, so maybe we cannot do the Breaks with
proper ordering.  Bioconductor can because of how it releases; CRAN cannot,
as it does not alter version numbers, only maintainers do.

So maybe we need a new field in DESCRIPTION to tag 'RecommendedRebuilds'?
Or maybe CRAN would get the right to do an incremental release, so that, say,
'my' digest 0.6.29 would become 0.6.29.1?  But how to communicate this _in
general_ back to maintainers?

Obviously, for binary packages we can cover this at the repo level, and CRAN
generally does, but there are also 'from source' users out there we should
cover.  So, in short, this is an open issue with no clear solution.

And all of this would of course require new and changed tooling.  As we often
do, maybe this too is something best tried first at the package level, via
opt-in alternatives to the functions in the base packages tools and utils.

Dirk

#
Mikael,

first, about the macOS part of the issue: the first step is that you can tell me and I can trigger a full re-build. For performance reasons the macOS build system does not do full reverse-dependency rebuilds, because they take too long: small updates to popular packages can trigger re-builds of large portions of CRAN, in most cases with no benefit.

Now, that still only addresses the part where users can upgrade to re-built versions __if they are aware__. However, they would have to know and do it explicitly, because a regular update.packages() would not re-install such reverse-dependent packages, since their version has not changed. Therefore I would strongly suggest making sure that updates are backward-compatible, because there is simply no way around the problem that users have no way of knowing that an upgrade of Matrix requires re-installation of many other packages. In fact, for Matrix I would argue that such breaking changes should only be allowed with a major R release, since at that point all packages have to be reinstalled by definition anyway.  So, to answer the second part of your question: until the next major (= annual) release.

Note that this problem can exist in both directions: even if you upgrade Matrix, if a dependent package is built against a newer Matrix __but does not require the newer version__, then the user doesn't know that the installed version is incompatible, so you have to guard in both directions.

Cheers,
Simon
#
On 13/09/2022 5:45 p.m., Mikael Jagan wrote:
This sounds like a bug or bad design in the S4 system, i.e. caching 
things without a way to detect or update when the cache becomes stale.

Is it really necessary to cache things as part of the binary package, or 
could they be put in place when needed using lazy loading, getting a 
copy from the loaded copy of Matrix?

Duncan Murdoch
#
On Tue, Sep 13, 2022 at 8:31 PM Ben Bolker <bbolker at gmail.com> wrote:

In practice, when you have a package in Bioconductor, we set up a
Bioconductor git repository for the package, which is what is used to
build Bioc. So we have:

bioc git <- used for binary building
optionally, package devel git repos (typically on GitHub)

"We" (Bioconductor) can then choose to change things in the bioc git,
which is upstream of the package devel git. So when Herve changes things in
basic S4 classes, I pull those changes into my GitHub repos and merge them
when updating my packages.

Regarding version numbers, the rule is that we have three digit blocks, x.y.z.
We release Bioc bi-annually; say we release a package at version 1.2.0 as
part of Bioconductor 3.15. After release, we bump the version number for
devel, so we create 1.3.0, which only exists in Bioc-devel. Then, over time,
the developer bumps z as changes are made in devel, and say we end up at
1.3.17. At release time, this gets made into 1.4.0.

So we have potentially tons of version bumping where there is no change in
code. This may sound chaotic, but it is helped by the notion of a
simultaneous release every 6 months. So most users only see 1.2.0 and 1.4.0
(which could be identical, although dependencies are likely to change). Once
you understand it, it is a pretty good system IMO, since, for example, I can
immediately see if a package is from devel (odd y) or release (even y). It
does go against some people's instincts regarding version numbering; here it
also helps that these rules are essentially enforced across all packages.
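The odd/even convention can be checked mechanically; a small sketch (the helper name `bioc_channel` is made up here):

```r
## Sketch: classify a Bioconductor-style version x.y.z as belonging to
## the devel or release branch by the parity of y, per the convention
## described above.  (The helper name 'bioc_channel' is invented here.)
bioc_channel <- function(version) {
    y <- unclass(package_version(version))[[1]][2]
    if (y %% 2 == 1L) "devel" else "release"
}
bioc_channel("1.3.17")  # "devel"
bioc_channel("1.4.0")   # "release"
```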

Best,
Kasper

8 days later
#
This issue may be the culprit in an obscure bug that's been reported on 
R-package-devel:  see 
https://stat.ethz.ch/pipermail/r-package-devel/2022q3/008481.html.  It 
appears that some ggplot2 version 3.4.0 code is being run even though 
3.3.6 is the version on CRAN, and 3.3.6 is *also* being run in the same 
check session.

Duncan Murdoch
On 14/09/2022 6:04 a.m., Duncan Murdoch wrote:
#
No, it looks as though this is an unrelated bit of caching being done by 
rstan.  During startup, it saves some ggplot defaults in an internal 
environment ".rstanvis_defaults"; that captured the ggplot2 3.3.6 
defaults, which caused problems when being run under ggplot2 3.4.0.

Duncan Murdoch
On 23/09/2022 5:54 a.m., Duncan Murdoch wrote: