This might fall under the purview of bundles, but I could not find any example bundles demonstrating what I am after. I would like to construct two packages (A, B) which use a number of common C functions. The most straightforward way to do this is to copy the relevant .c and .h files from one src directory to the next, but this is tedious, especially with multiple developers and ongoing changes.

If I declare in the Depends field that package A needs to be present in order to install package B, this only enforces that package A has been installed, correct? Is there a way to check whether the source of a package is available, and to compile against it? (The source of a package does not seem to be installed by default, so this might, in general, be impossible.)

Linking against installed packages seems easier, in the sense that if an installed package uses native code I know its .so is available. But is there a Makevars variable I can use to tell R to add to its linking command during R CMD INSTALL? Does anyone have examples of configure scripts which do this by hand? I could see this being a relatively painless addition for linking, by mapping each dependency declared in the Depends field (in DESCRIPTION) to an additional directory to link against. For compiling, though, I don't see an obvious solution besides writing it myself in configure, which might make the package much harder for the user to install.

Sorry if this is in the help manual - I have looked in the places where I thought it might naturally be, but did not see anything.

Thanks in advance, jim
multiple packages using the same native code.
5 messages · James Bullard, Seth Falcon, Duncan Temple Lang
Hi Jim, James Bullard <bullard at berkeley.edu> writes:
> I would like to construct two packages (A, B) which utilize a number of common C functions. [...]
I'm not sure I understand what you are after. One possible solution would be to create a third package 'C' that contains the common C code. This would allow you to call C functions defined in 'C' from the C code in 'A' or 'B'.
Using a .onLoad hook and getNativeSymbolInfo(), you can pass C
function pointers to the code in packages A and B.
Suppose in 'C' you have a C function foo() that is registered in the
usual manner so that it can be called by .Call or .C.
Then in 'A' you could have (all untested, sorry, but hopefully it
sketches the idea for you):
A/src/A.c:

#include <Rinternals.h>
#include <R_ext/Rdynload.h>

/* filled in at load time with the address of foo() from package 'C' */
static DL_FUNC C_foo;

/* invoked via .Call(), so it must take and return SEXP */
SEXP init_funcs_from_C(SEXP foo_info) {
    C_foo = (DL_FUNC) R_ExternalPtrAddr(foo_info);
    return R_NilValue;
}

void bar(int *x) {
    ...
    z = C_foo();
    ...
}
A/R/zzz.R
.onLoad <- function(libname, pkgname) {
foo_info <- getNativeSymbolInfo("foo", PACKAGE="C")
.Call("init_funcs_from_C", foo_info$address)
}
+ seth
Seth, thanks for the advice. This solution seems like it might work, but then all errors occur at runtime rather than at compile time. This seems like I am exchanging one evil for another (run-time segfaults versus code duplication). Let's say we have these three packages A, B, and C, defined more or less like this:
A/src/bar.c

int foo(void);  /* defined in package C */

int bar(void)
{
    return foo();
}

B/src/baz.c

int foo(void);  /* defined in package C */

int baz(void)
{
    return foo();
}

C/src/foo.c

int foo(void)
{
    return 1;
}
Now, the only way I can see to do this is to copy foo.c into the src directories of both package A and B. This is not what anyone wants; I'd rather just say that both A and B depend on package C. If I put them in a bundle, can I expect that the source will always be simultaneously available? If so, I can easily modify the configure script to handle this. But if I have no way to depend on the presence of the code (i.e. users could download and install the packages separately even if it's a bundle), then it seems there is no way to modify the configure file to do this in general.
thanks, jim
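For the linking route, a hedged sketch of what A's src/Makevars could look like if package 'C' is already installed. Everything here is an assumption: the package's shared object is installed as C.so (not libC.so), so it would be linked by full path rather than with -l, the $(shell ...) function is a GNU make extension, and this approach is not portable across all platforms:

```make
# src/Makevars for package A -- sketch only; assumes an installed package 'C'
# that ships headers in inst/include and its shared object in libs/.
C_INC_DIR  = $(shell "$(R_HOME)/bin/Rscript" -e 'cat(system.file("include", package = "C"))')
C_LIBS_DIR = $(shell "$(R_HOME)/bin/Rscript" -e 'cat(system.file("libs", package = "C"))')

PKG_CPPFLAGS = -I$(C_INC_DIR)
PKG_LIBS     = $(C_LIBS_DIR)/C$(SHLIB_EXT)
```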
Seth Falcon wrote:
[earlier message quoted in full; snipped]
______________________________________________ R-devel at r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
James Bullard <bullard at berkeley.edu> writes:
Seth, thanks for the advice. This solution seems like it might work, but then all errors occur at runtime rather than at compile time.
I'm sure you could still create some compile time errors ;-)

Yes, doing things dynamically means you won't catch nearly as much at compile time. I think with some reasonable testing this really isn't so bad.

Another option would be to develop a separate C library and have the configure scripts for 'A' and 'B' find where it is installed.
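A minimal sketch of what such a configure script could do, assuming the common code is built and installed as an ordinary library (libcommon) entirely outside of R. Every path and name here is hypothetical:

```shell
# configure fragment for package 'A' (sketch): locate a separately
# installed libcommon and emit the Makevars flags it implies.
# COMMON_HOME, common.h, and -lcommon are all hypothetical names.
COMMON_HOME=${COMMON_HOME:-/usr/local}

if [ -f "${COMMON_HOME}/include/common.h" ]; then
    found=yes
else
    found=no
    echo "warning: common.h not found under ${COMMON_HOME}/include" >&2
fi

PKG_CPPFLAGS="-I${COMMON_HOME}/include"
PKG_LIBS="-L${COMMON_HOME}/lib -lcommon"
printf 'PKG_CPPFLAGS = %s\nPKG_LIBS = %s\n' "$PKG_CPPFLAGS" "$PKG_LIBS"
```

A real configure would typically redirect that output into src/Makevars and exit with an error when the library cannot be found.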
There is work underway to be able to handle this concept of a package providing native code to other packages. It is done in several packages already, but it is time to make the package mechanism extensible, and this feature is one of the motivating examples. It probably won't make it into 2.3.0, as I am only just finishing a quarter of intense teaching, but it will be available reasonably soon (i.e. a month or so).

Copying the code is the most natural approach, but it does not support the important case where one wants a single instance of a shared native symbol, i.e. a global data object. There are several situations that we want to be able to support, and these will be possible via a package mechanism that relies more on R code than shell & Perl scripts.
James Bullard wrote:
[earlier message quoted in full; snipped]
- --
Duncan Temple Lang <duncan at wald.ucdavis.edu>
Department of Statistics, University of California at Davis
4210 Mathematical Sciences Building, One Shields Ave., Davis, CA 95616, USA
work: (530) 752-4782   fax: (530) 752-7099