diff --git a/DESCRIPTION b/DESCRIPTION
index be663d935bde0c7c77b4ff346273e282988aef9c..19d80bd5af20912fa082740085d97a683ec0de4d 100644
--- a/DESCRIPTION
+++ b/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: gmGeostats
-Version: 0.11.1
-Date: 2022-05-03
+Version: 0.11.3
+Date: 2023-04-04
 Title: Geostatistics for Compositional Analysis
 Authors@R: c(person(given = "Raimon", 
         family = "Tolosana-Delgado", 
@@ -50,7 +50,7 @@ Description: Support for geostatistical analysis of multivariate data,
          The methods mainly follow Tolosana-Delgado, Mueller and van den
          Boogaart (2018) <doi:10.1007/s11004-018-9769-3>.
 License: CC BY-SA 4.0 | GPL (>=2)
-URL: https://gitlab.hzdr.de/geomet/gmGeostats
+URL: https://codebase.helmholtz.cloud/geomet/gmGeostats
 Copyright: (C) 2020 by Helmholtz-Zentrum Dresden-Rossendorf and Edith Cowan University;
          gsi.DS code by Hassan Talebi
 RoxygenNote: 7.1.1
diff --git a/NAMESPACE b/NAMESPACE
index 9dca9b7de7bf205588d1f37293a6d74a6f7c2803..84575aae17b144c199c41b53b2fe7b0fb078ab0c 100644
--- a/NAMESPACE
+++ b/NAMESPACE
@@ -325,3 +325,4 @@ useDynLib(gmGeostats,CMVTurningBands)
 useDynLib(gmGeostats,CcalcCgram)
 useDynLib(gmGeostats,anaBackwardC)
 useDynLib(gmGeostats,anaForwardC)
+useDynLib(gmGeostats,getUnitvecR)
diff --git a/NEWS.md b/NEWS.md
index a0d109aee7fdd0ee1d19f2c8e16df89edbfeb74f..3360145e713670774e6357cab65de65b78139500 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -1,3 +1,14 @@
+# gmGeostats 0.11.3
+
+* (2023-04-04) C routines called as symbols and not name strings; non-ASCII characters removed
+
+# gmGeostats 0.11.2
+
+* (2022-11-22) flag '-lstdc++' removed from Makevars.
+* (2022-11-15) minor bugs in accuracy calculations corrected. References improved.
+* (2022-08-15) bug in turning bands for spherical variogram corrected.
+
+
 # gmGeostats 0.11.1
 
 * (2022-05-03) minor changes to adapt to more stringent standards for class determination, for variable definition in C code, and for calls to FORTRAN routines from C code involving string arguments.
diff --git a/R/Anamorphosis.R b/R/Anamorphosis.R
index 61ea31a891940688c64559f17242af1de017ac02..e6a81df0929ef2a2d46fd24d173a634e10debe80 100644
--- a/R/Anamorphosis.R
+++ b/R/Anamorphosis.R
@@ -145,7 +145,7 @@ anaForward <- function(x,Y,sigma0,sigma1=1+sigma0,steps=30,plt=FALSE,sphere=TRUE
   #h=1/steps
   #if( plt )
   #    plot(x[,1:2],pch=".")
-  erg<-.C("anaForwardC",
+  erg<-.C(anaForwardC,
      dimX=checkInt(dim(x),2),
      x=checkDouble(x),
      dimY=checkInt(dim(Y),2),
@@ -228,7 +228,7 @@ anaBackward <- function(x,Y,sigma0,sigma1=1+sigma0,steps=30,plt=FALSE,sphere=TRU
   }
   Y<-t(st(Y))
   x <- t(x)
-  erg<-.C("anaBackwardC",
+  erg<-.C(anaBackwardC,
      dimX=checkInt(dim(x),2),
      x=checkDouble(x),
      dimY=checkInt(dim(Y),2),
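For context, the change from `.C("anaForwardC", ...)` to `.C(anaForwardC, ...)` calls the C routine through the native symbol object that `useDynLib()` creates in the package namespace (see the `useDynLib(gmGeostats, ...)` entries in NAMESPACE above). A minimal sketch of that pattern, with a hypothetical package and routine name (it needs the compiled routine to actually run):

# NAMESPACE of a hypothetical package 'mypkg':
#   useDynLib(mypkg, myRoutineC)
# R source of 'mypkg': pass the symbol object, not the string "myRoutineC"
x <- rnorm(10)
erg <- .C(myRoutineC,
          n = as.integer(length(x)),
          x = as.double(x))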
diff --git a/R/accuracy.R b/R/accuracy.R
index 4a2e6cfe184522c708cea47518a5ee392b0f907d..eb13e61423d34af16676571d530b100d4a31f06b 100644
--- a/R/accuracy.R
+++ b/R/accuracy.R
@@ -5,7 +5,7 @@
 #' Computes goodness-of-fit measures (accuracy, precision and joint goodness) adapted or extended from the 
 #' definition of Deutsch (1997).
 #'
-#' @param x data container for the predictions (plus cokriging error variances/covariance) or simulations (and eventually for the true values in univariate problems)
+#' @param object data container for the predictions (plus cokriging error variances/covariance) or simulations (and eventually for the true values in univariate problems)
 #' @param ... generic functionality, currently ignored
 #'
 #' @return If `outMahalanobis=TRUE` (the primary use), this function returns a two-column dataset of class
@@ -13,18 +13,18 @@
 #' the actual coverage of the intervals of each of these probabilities. If `outMahalanobis=FALSE`, the output
 #' is a vector (for prediction) or matrix (for simulation) of Mahalanobis error norms.
 #' 
-#' @details For method "kriging", `x` must contain columns with names including the string "pred" for predictions
+#' @details For method "kriging", `object` must contain columns with names including the string "pred" for predictions
 #' and "var" for the kriging variance; the observed values can also be included as an extra column with name "observed",
-#' or else additionally provided in argument `observed`. For method "cokriging", the columns of `x` must contain
+#' or else additionally provided in argument `observed`. For method "cokriging", the columns of `object` must contain
 #' predictions, cokriging variances and cokriging covariances in columns including the strings "pred", "var" resp. "cov",
 #' and observed values can only be provided via `observed` argument. Note that these are the natural formats when
 #' using [gstat::predict.gstat()] and other (co)kriging functions of that package.
 #' 
 #' For univariate and multivariate cokriging results (methods "kriging" and "cokriging"), the coverage values are computed based on the
 #' Mahalanobis square error, the (square) distance between prediction and true value, using as the positive definite bilinear form 
-#' of the distance the variance-covariance cokriging matrix. The rationale is that, under the  assumption
-#' that the random field is Gaussian, the distribution of this Mahalanobis square error is known
-#' to follow a \eqn{\chi^2(\nu)} with degrees of freedom \eqn{\nu} equal to the number of variables. Having this
+#' of the distance the variance-covariance cokriging matrix. The rationale is that, under the assumption
+#' that the random field is Gaussian, the distribution of this Mahalanobis square error should
+#' follow a \eqn{\chi^2(\nu)} with degrees of freedom \eqn{\nu} equal to the number of variables. Having this
 #' reference distribution allows us to compute confidence intervals for that Mahalanobis square error, and then 
 #' count how many of the actually observed errors are included on each one of the intervals (the *coverage*).
 #' For a perfect adjustment to the distribution, the plot of coverage vs. nominal confidence (see [plot.accuracy])
@@ -33,9 +33,9 @@
 #' from the standard normal distribution appearing by normalizing the residual with the kriging variance; the result is the
 #' same. 
 #' 
-#' For method "simulation" and object `x` is a data.frame, the variable names containing the realisations must 
-#' contain the string "sim", and `observed` must be a vector with as many elements as rows has `x`. If 
-#' `x` is a [DataFrameStack()], then it is assumed that the stacking dimension is running through the realisations;
+#' For method "simulation" and object `object` is a data.frame, the variable names containing the realisations must 
+#' contain the string "sim", and `observed` must be a vector with as many elements as rows has `object`. If 
+#' `object` is a [DataFrameStack()], then it is assumed that the stacking dimension is running through the realisations;
 #' the true values must still be given in `observed`.
 #' In both cases, the method is based on ranks:
 #' with them we can calculate which is the frequency of simulations being more extreme than the observed value.
@@ -43,7 +43,7 @@
 #' for each location separately. 
 #' 
 #' Method "mahalanobis" ("Mahalanobis" also works) is the analogous for multivariate simulations. It
-#' only works for `x` of class [DataFrameStack()], and requires the stacking dimension to run through
+#' only works for `object` of class [DataFrameStack()], and requires the stacking dimension to run through
 #' the realisations and the other two dimensions to coincide with the dimensions of `observed`, i.e.
 #' giving locations by rows and variables by columns. In this case, a covariance matrix will be computed
 #' and this will be used to compute the Mahalanobis square error defined above in method "cokriging": 
@@ -59,7 +59,12 @@
 #' 
 #' @export
 #' @family accuracy functions
-accuracy <- function(x,...) UseMethod("accuracy", x)
+#' @references Mueller, Selia and Tolosana-Delgado (2023) Multivariate cross-validation
+#' and measures of accuracy and precision. 
+#' Mathematical Geosciences (under review).
+#' 
+#' 
+accuracy <- function(object,...) UseMethod("accuracy", object)
 
 
 
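The coverage computation described above can be checked in isolation: under the Gaussian assumption the Mahalanobis square errors follow a chi-squared distribution with as many degrees of freedom as there are variables, so the empirical coverage of each nominal probability is the share of errors below the corresponding quantile. A self-contained sketch with synthetic standardised residuals (not package code):

set.seed(1)
D <- 3; N <- 500
resid <- matrix(rnorm(N * D), ncol = D)          # standardised (co)kriging residuals
mynorms <- rowSums(resid^2)                      # Mahalanobis square errors (identity covariance)
prob <- seq(from = 0, to = 1, by = 0.05)
qqq <- stats::qchisq(prob, df = D)               # reference chi-squared quantiles
coverage <- colMeans(outer(mynorms, qqq, "<="))  # empirical coverage per nominal probability
round(data.frame(p = prob, accuracy = coverage), 2)  # ideally accuracy is close to p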
@@ -68,19 +73,20 @@ accuracy <- function(x,...) UseMethod("accuracy", x)
 #' @param observed either a vector- or matrix-like object of the true values
 #' @param prob sequence of cutoff probabilities to use for the calculations
 #' @param method which method was used for generating the predictions/simulations? 
-#' one of c("kriging", "cokriging", "simulation") for `x` of class "data.frame", or of
-#' c("simulation", "mahalanobis", "flow") for `x` of class [DataFrameStack()].
+#' one of c("kriging", "cokriging", "simulation") for `object` of class "data.frame", or of
+#' c("simulation", "mahalanobis", "flow") for `object` of class [DataFrameStack()].
 #' @param outMahalanobis if TRUE, do not do the final accuracy calculations and return the Mahalanobis
 #' norms of the residuals; if FALSE do the accuracy calculations
 #' @param ivar if `method`="kriging" or "cokriging" you can also specify here one single variable name 
-#' to consider for univariate accuracy; this variable name must exist both in `x` 
+#' to consider for univariate accuracy; this variable name must exist both in `object` 
 #' (including "pred" and "var" prefixes or suffixes in the column names) and in `observed`;
 #' this might require renaming the columns of these files!
 #' @export
-accuracy.data.frame <- function(x, observed=x$observed, 
+accuracy.data.frame <- function(object, observed=object$observed, 
                                 prob = seq(from=0, to=1, by=0.05),
                                 method="kriging", outMahalanobis=FALSE, 
                                 ivar, ...){
+  x = object # after v 0.11.9003, 'accuracy' first argument is renamed to 'object' for compatibility with tidymodels 
   methods = c("kriging", "cokriging", "simulation")
   mm = methods[pmatch(method, methods)]
   if(!missing(ivar)){
@@ -88,7 +94,7 @@ accuracy.data.frame <- function(x, observed=x$observed,
     iPred = intersect(grep(ivar, colnames(x)), grep("pred", colnames(x)))
     iVar = intersect(grep(ivar, colnames(x)), grep("var", colnames(x)))
     if(any(sapply(list(iTrue, iPred, iVar), length)!=1))
-      stop("accuracy: univariate case by specifying an `ivar` requires the named variable to occur ONCE in `observed` and once in `x`, here prefixed or suffixed with `pred` and `var` to identify kriging predictions and variances")
+      stop("accuracy: univariate case by specifying an `ivar` requires the named variable to occur ONCE in `observed` and once in `object`, here prefixed or suffixed with `pred` and `var` to identify kriging predictions and variances")
     mm = "kriging"
     observed = observed[,iTrue]
     x = x[, c(iPred, iVar)]
@@ -125,7 +131,7 @@ accuracy.data.frame <- function(x, observed=x$observed,
   if(outMahalanobis)
     return(mynorms)
   # cases for kriging and cokriging
-  qqq = stats::qchisq(prob,df=D)
+  qqq = stats::qchisq(prob,df=D)   # TODO: this could be Hotelling's T^2 distributed? 
   aa = outer(mynorms,qqq,"<=")
   a = colMeans(aa)
   erg = data.frame(p=prob, accuracy=a)
@@ -139,19 +145,20 @@ accuracy.data.frame <- function(x, observed=x$observed,
 #' @param ivars in multivariate cases, a vector of names of the variables to analyse (or one single variable name)
 #' @method accuracy DataFrameStack
 #' @export
-accuracy.DataFrameStack <- function(x, observed, 
-                                    ivars = intersect(colnames(observed), dimnames(x)[[noStackDim(x)]]),
+accuracy.DataFrameStack <- function(object, observed, 
+                                    ivars = intersect(colnames(observed), dimnames(object)[[noStackDim(object)]]),
                                     prob = seq(from=0, to=1, by=0.05),
                                     method = ifelse(length(ivars)==1, "simulation", "Mahalanobis"),
                                     outMahalanobis=FALSE, ...){
+  x = object  # after v 0.11.9003, 'accuracy' first argument is renamed to 'object' for compatibility with tidymodels 
   methods = c("simulation", "Mahalanobis","mahalanobis", "flow")
   mm = methods[pmatch(method, methods)]
   oneAcc.sim = function(sims, true){
-    rks = rank(c(true,sims))
+    rks = rank(c(true,as.matrix(sims)))
     2*abs(rks[1]/(1+length(sims))-0.5)
   }
   oneAcc.assim = function(sims, true){
-    rks = rank(c(true,sims))
+    rks = rank(c(true,as.matrix(sims)))
     rks[1]/(1+length(sims))
   }
   
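The rank-based measure used in `oneAcc.sim()` above can be illustrated on its own: the rank of the observed value among the realisations measures how extreme the observation sits in the simulated distribution. A small stand-alone example:

set.seed(2)
sims <- rnorm(100)                           # realisations at one location
true <- 0.3                                  # observed value at that location
rks  <- rank(c(true, sims))                  # observation ranked among the simulations
2 * abs(rks[1] / (1 + length(sims)) - 0.5)   # two-sided coverage measure, as in oneAcc.sim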
@@ -160,7 +167,7 @@ accuracy.DataFrameStack <- function(x, observed,
       warning("accuracy: outMahalanobis=TRUE not valid with method='simulation'")
     if(nrow(x)!=length(observed))
       if(nrow(x)!=nrow(observed))
-        stop("accuracy: dimensions of x and observed do not match")
+        stop("accuracy: dimensions of `object` and `observed` do not match")
     sims = as.matrix(gmApply(x, FUN=function(xx)xx[,ivars, drop=FALSE]))
     res = sapply(1:nrow(sims), function(i) oneAcc.sim(sims[i,], observed[i, ivars, drop=FALSE]))
     aa = outer(res, prob, "<=")
@@ -406,7 +413,7 @@ xvErrorMeasures.data.frame = function(x, observed=x$observed, output="MSDR1",
     preds = sapply(unclass(xreord), cbind)
     obs = as.matrix(observed)
     resids = preds-obs
-    if(output=="ME") return(mean(resids, na.rm=TRUE))
+    if(output=="ME") return(colMeans(resids, na.rm=TRUE))
     if(output=="MSE") return(mean(rowSums(resids^2, na.rm=T), na.rm=TRUE))
     if(output=="MSDR2"){
       myvar = function(j) covs[,j,j]
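The switch to `colMeans()` above matters for multivariate cross-validation: the mean error (ME) should be reported per variable, because pooling all residuals into a single mean lets biases of opposite sign in different variables cancel out. A tiny check:

resids <- cbind(a = c(0.5, -0.5, 1.0), b = c(-2, -2, -2))
mean(resids)      # pooled over everything: -0.83, masks the bias in 'b'
colMeans(resids)  # per variable: a = 0.33, b = -2.00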
diff --git a/R/compositionsCompatibility.R b/R/compositionsCompatibility.R
index 0846f3bf4ffcebbe1b4ba900e4cff3e0434b5486..551bcc0d0779b20d8485dd558904845fc0f2b32c 100644
--- a/R/compositionsCompatibility.R
+++ b/R/compositionsCompatibility.R
@@ -36,7 +36,7 @@ logratioVariogram <- function(data, ...) UseMethod("logratioVariogram", data)
 #' @param comp an alias for `data`, provided for back-compatibility with compositions::logratioVariogram 
 logratioVariogram.default <- function(data, loc, ..., comp=data){
   res = try(compositions::logratioVariogram(comp=acomp(comp), loc=loc, ...), silent=TRUE)
-  if(class(res)!="try-error") return(res)
+  if(!inherits(res, what="try-error", which=FALSE)) return(res)
   res = compositions::logratioVariogram(data=acomp(comp), loc=loc, ...)
 }
   
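The move to `inherits()` avoids comparing `class(res)` against a single string, which yields a logical vector for objects carrying more than one class; using such a vector as an `if()` condition is an error from R 4.2.0 onwards. A runnable illustration:

res <- try(stop("boom"), silent = TRUE)
inherits(res, what = "try-error")   # TRUE: the call failed

tbl <- structure(list(x = 1), class = c("tbl_df", "tbl", "data.frame"))
class(tbl) != "try-error"           # length-3 logical vector, unusable in if()
inherits(tbl, what = "try-error")   # FALSE: a single logical, as if() requires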
diff --git a/R/data.R b/R/data.R
index 68f8e80a9fff3d6cb7a0f3c06dc11b262810bd4b..707db276150505ad1d24b5549026f786b7f5c8fc 100644
--- a/R/data.R
+++ b/R/data.R
@@ -13,8 +13,8 @@
 #' * The samples were complemented with information about their belonging to one of the  
 #' major crustal blocks (MCB) of Australia.
 #' * Easting and Northing coordinates were computed using the Lambert conformal conic projection of 
-#' Australia (earth ellipsoid GRS80; standard parallels: 18°S and 36ºS latitude; central meridian: 
-#' 134ºE longitude).  
+#' Australia (earth ellipsoid GRS80; standard parallels: 18 and 36 degrees S latitude; central meridian: 
+#' 134 degrees E longitude).  
 #'
 #'
 #' @format A tibble, a data set of compound class c("tbl_df", "tbl", "data.frame") with 5259 
@@ -32,7 +32,7 @@
 #'   \item{REGION}{One of the three geo-regions of the data set: "EAST", "WEST" or "EUCLA"}
 #'   \item{DUPLICATE CODE}{Marker for duplicate samples, for quality control}
 #'   \item{SAMPLEID}{ID of the sample}
-#'   \item{GRAIN SIZE}{Particle size of the soil sample, one of "<2 mm" (coarse) or "<75 µm" (fine)}
+#'   \item{GRAIN SIZE}{Particle size of the soil sample, one of "<2 mm" (coarse) or "<75 um" (fine)}
 #'   \item{DEPTH}{Sampling depth, one of: "TOS" (top soil) or "BOS" (bottom soil)}
 #'   \item{CODE}{A combination of `Grain Size` and `DEPTH`, one of "Tc" "Tf" "Bc" "Bf", standing for TOS coarse, TOS fine, BOS coarse and BOS fine, respectively}
 #'   \item{`Ag ICP-MS mg/kg 0.03`}{concentration of silver in that sample, analysed with inductively coupled plasma mass spectrometry (ICP-MS), with a detection limit of 0.03 ppm}
diff --git a/R/exploratools.R b/R/exploratools.R
index 9d6746ac9e00aaff436a5ff2a08c7abc041f5c00..468f7fe63fb45020d71445d4d1bda7ee01b47547 100644
--- a/R/exploratools.R
+++ b/R/exploratools.R
@@ -1,593 +1,593 @@
-#############################################################################
-#####     FUNCTIONS FOR SPATIAL COMPOSITIONAL EXPLORATORY ANALYSIS     ######
-#############################################################################
-
-
-
-#' Check presence of missings
-#' check presence of missings in a data.frame
-#' @param x data, of class data.frame
-#' @param mc not used
-#' @param ... not used
-#'
-#' @return A single true/false response about the presence of any missing value 
-#' on the whole data set
-#' @export
-#' @importFrom compositions has.missings
-#' @method has.missings data.frame
-#'
-#' @examples
-#' library(compositions)
-#' data(Windarling)
-#' has.missings(Windarling)
-#' head(Windarling)
-#' Windarling[1,1] = NA
-#' head(Windarling)
-#' has.missings(Windarling)
-has.missings.data.frame = function(x, mc=NULL, ...){
-  if(is.null(x))
-    return(FALSE)
-  (!is.null(x)) && any(is.na(x))
-}
-
-
-
-
-#' Compositional maps, pairwise logratios
-#' Matrix of maps showing different combinations of components of a composition, in pairwise logratios
-#' @param loc matrix or data.frame of coordinates of the sample locations
-#' @param comp composition observed at every location, can be a matrix, a data.frame or
-#'  of one of the classes \code{compositions::acomp} or \code{compositions::aplus}
-#' @param colscale set of colors to be used as colorscale (defauls to 10 colors between blue and red)
-#' @param cexrange symbol size min and max values (default to 0.1 to 2)
-#' @param scale function scaling the set of z-values of each map, defaults to \code{\link{rank}}
-#' @param commonscale logical, should all plots share a common z-scale? defaults to FALSE
-#' @param foregroundcolor color to be used for the border of the symbol
-#' @param closeplot logical, should the plot be left open (FALSE) for further changes, or be frozen (TRUE)? 
-#' defaults to TRUE
-#'
-#' @return The function is primarily called for producing a matrix of (D,D) plots of the D-part 
-#' compositional samples, where at each plot we represent a map whose symbols are colored and 
-#' sized according to a z-scale controlled by a different logratio. For each plot, this is the 
-#' logratio of the row variable by the column variable. However, in case that `closeplot=FALSE`, 
-#' this function returns 
-#' invisibly the graphical parameters that were active prior to calling this function. This allows 
-#' the user to add further stuff to the plots (mostly, using \code{par(mfg=c(i,j))} to plot on the
-#' diagram (i,j)), or manually freeze the plot (by wrapping the call to \code{pwlrmap} on a call to 
-#' \code{\link{par}}).
-#' @export
-#' @importFrom graphics plot text mtext
-#'
-#' @examples
-#' 
-#' data("Windarling")
-#' coords = as.matrix(Windarling[,c("Easting","Northing")])
-#' compo = Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")]
-#' compo$Rest = 100-rowSums(compo)
-#' compo = compositions::acomp(compo) 
-#' # in quantiles (default, ranking controls color and size)
-#' pwlrmap(coords, compo) 
-#' \donttest{
-#' # in logratios (I=identity)
-#' pwlrmap(coords, compo, scale=I)
-#' # in ratios (i.e., apply exp)
-#' pwlrmap(coords, compo, scale=exp)  
-#' # use only color, no change in symbol size
-#' pwlrmap(coords, compo, cexrange=c(1,1)) 
-#' # change all
-#' pwlrmap(coords, compo, commonscale=TRUE, cexrange=c(1.2,1.2), 
-#'                     colscale=rev(rainbow(40, start=0, end=4/6))) 
-#' }
-pwlrmap = function(loc,   # XY coordinates (matrix or data frame)
-                   comp,  # composition (matrix, acomp, aplus or data.frame)
-                   colscale=rev(rainbow(10, start=0, end=4/6)), # color scale (defauls to 10 colors between blue and red)
-                   cexrange=c(0.1, 2),  # symbol size min and max values (default to 0.1 to 2)
-                   scale=rank, # scaling FUNCTION (defaults to ranking, i.e. quantiles)
-                   commonscale=FALSE,  # should all plots be generated with a common scale?
-                   foregroundcolor="black",
-                   closeplot = TRUE
-                   ){
-  # set of maps where the symbols are chosen according to each possible pwlr, in 
-  # a scale given by the user
-  opar = par()
-  opar = par_remove_readonly(opar)
-  
-  if(closeplot) on.exit(par(opar))
-  # dimensions
-  D = ncol(comp)
-  N = nrow(loc)
-  # compute pwlrs
-  ij = expand.grid(i=1:D, j=1:D) # pairs of indices
-  mat = matrix(0, nrow=D, ncol=D*D)   # matrix of +1 numerator, -1 denominator
-  for(k in 1:nrow(ij)){
-    mat[ij[k,1],k]=-1
-    mat[ij[k,2],k]=1
-  }
-  Zpwlr = as.matrix(log(unclass(comp))) %*% mat
-  # scale the variables
-  if(commonscale){
-    sizes  = scale(c(unlist(Zpwlr)))
-    dim(sizes) = dim(Zpwlr)
-    # calculate common cutting levels
-    commonbks = seq(from=min(sizes[is.finite(sizes)]), to=max(sizes[is.finite(sizes)]), length.out=length(colscale))
-    dfbks = diff(commonbks)[1]
-    commonbks = c(commonbks[1]-dfbks, commonbks+0.5*dfbks)
-  }else{
-    sizes = gmApply(Zpwlr,2,scale)    
-  }
-  # do the plot
-  par(mfrow=c(D,D), mar=c(1,1,1,1)/5, oma=c(3,5,5,3))
-  for(i in 1:D){
-    for(j in 1:D){
-      if(i==j){
-        # diagonal plots, only labels
-        plot(loc, type="n", xaxt=ifelse(i==D,"s","n"), yaxt=ifelse(j==1,"s","n") )
-        text(mean(range(loc[,1])), mean(range(loc[,2])), colnames(comp)[i], cex=2)
-      }else{
-      # off-diagonal plots, maps  
-      k = (i-1)*D+j        
-      sz = sizes[,k]  
-      # choose which breaks to use, the common ones or the particular ones
-      if(commonscale){
-        bks = commonbks
-      }else{
-       bks = seq(from=min(sz[is.finite(sz)]), to=max(sz[is.finite(sz)]), length.out=length(colscale))
-         dfbks = diff(bks)[1]
-       bks = c(bks[1]-dfbks, bks+0.5*dfbks)
-      }
-      # compute colors and sizes
-      cols = colscale[cut(sz, bks)]
-      sz = cexrange[1]+(cexrange[2]-cexrange[1])*(sz-bks[1])/c(range(bks) %*% c(-1,1))
-      # actual map
-      plot(loc, cex=sz, bg=cols, pch=21, asp=1, col=foregroundcolor,
-            xaxt=ifelse(i==D,"s","n"),yaxt=ifelse(j==1,"s","n")
-           )
-      }
-      # add axes on extra sides
-      if(i==1) axis(side=3)
-      if(j==D) axis(side=4)
-    }
-  }
-  # add labels
-  mtext("numerator", side=2, outer=TRUE, line=2.5, cex=1.25)
-  mtext("denominator", side=3, outer=TRUE, line=2.5, cex=1.25)
-  # return the old graphical parameters to freeze the plot
-  invisible(opar)
-}
-
-
-
-
-
-
-#' Multiple maps
-#' Matrix of maps showing different combinations of components of a composition, user defined
-#' 
-#'
-#' @param data data to represent; admits many data containing objects 
-#' (matrix, data.frame, classes from package \code{compositions}) as well 
-#' as their \code{Spatial} counterparts (in which case, \code{loc} is not necessary)
-#' @param ... extra arguments for generic functionality
-#'
-#' @return The function is primarily called for producing a matrix of plots of each component of a 
-#' multivariate data set, such as a compositional data set. Each plot represents a map whose symbols 
-#' are colored and sized according to a z-scale controlled by one of the variables of the data set. 
-#' It can be used virtually with any geometry, any kind of data (compositional, positive, raw, etc). 
-#' This function returns invisibly the graphical parameters that were active prior to calling this 
-#' function. This allows the user to add further stuff to the plots (mostly, using \code{par(mfg=c(i,j))} 
-#' to plot on the diagram (i,j)), or else freeze the plot (by wrapping the call to \code{pwlrmap} 
-#' on a call to \code{\link{par}}).
-#' @export
-#' @importFrom grDevices rainbow
-#' @importFrom graphics par plot
-#'
-#' @examples
-#' data("Windarling")
-#' library("compositions")
-#' coords = as.matrix(Windarling[,c("Easting","Northing")])
-#' compo = Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")]
-#' compo$Rest = 100-rowSums(compo)
-#' compo = acomp(compo)
-#' pairsmap(data=clr(compo), loc=coords) # clr
-#' pairsmap(data=compo, loc=coords) # closed data
-pairsmap <- function(data, ...) UseMethod("pairsmap", data)
-
-
-#' @describeIn pairsmap Multiple maps
-#' @method pairsmap SpatialPointsDataFrame
-pairsmap.SpatialPointsDataFrame <- function(data, ...){
-  pairsmap.default(data@data, loc=sp::coordinates(data), ...)
-}
-
-
-#' @describeIn pairsmap Multiple maps
-#' @method pairsmap default
-#' @export
-#' @param loc matrix or data.frame of coordinates of the sample locations
-#' @param colscale set of colors to be used as colorscale (defauls to 10 colors between blue and red)
-#' @param cexrange symbol size min and max values (default to 0.1 to 2)
-#' @param scale function scaling the set of z-values of each map, defaults to \code{\link{rank}}
-#' @param commonscale logical, should all plots share a common z-scale? defaults to FALSE
-#' @param mfrow vector of two integers, giving the number of plots per row and per column of the 
-#' matrix of plots to be created; defaults to something square-ish, somewhat wider than longer, and able to
-#' contain all plots
-#' @param foregroundcolor color to be used for the border of the symbol
-#' @param closeplot logical, should the plot be left open (FALSE) for further changes, or be frozen (TRUE)? 
-#' defaults to TRUE
-pairsmap.default <- function(data,   # data to represent
-                    loc,   # XY coordinates (matrix or data frame)
-                   colscale=rev(rainbow(10, start=0, end=4/6)), # color scale (defauls to 10 colors between blue and red)
-                   cexrange=c(0.1, 2),  # symbol size min and max values (default to 0.1 to 2)
-                   scale=rank, # scaling FUNCTION (defaults to ranking, i.e. quantiles)
-                   commonscale=FALSE,  # should all plots be generated with a common scale?
-                   mfrow = c(floor(sqrt(ncol(data))), ceiling(ncol(data)/floor(sqrt(ncol(data))))),# automatic distribution of figs
-                   foregroundcolor = "black", 
-                   closeplot=TRUE,
-                   ...
-){
-  opar = par()
-  opar = par_remove_readonly(opar)
-  if(closeplot) on.exit(par(opar))
-  # dimensions
-  D = ncol(data)
-  N = nrow(loc)
-  # scale the variables
-  if(commonscale){
-    sizes  = scale(c(unlist(data)))
-    dim(sizes) = dim(data)
-    # calculate common cutting levels
-    commonbks = seq(from=min(sizes[is.finite(sizes)]), to=max(sizes[is.finite(sizes)]), length.out=length(colscale))
-    dfbks = diff(commonbks)[1]
-    commonbks = c(commonbks[1]-dfbks, commonbks+0.5*dfbks)
-  }else{
-    sizes = gmApply(data,2,scale)    
-  }
-  # do the plot
-  par(mfrow=mfrow, mar=c(1,1,10,1)/5, oma=c(3,3,3,3))
-  drow = mfrow[1]
-  dcol = mfrow[2]
-  for(i in 1:drow){
-    for(j in 1:dcol){
-        # off-diagonal plots, maps  
-        k = (i-1)*dcol+j        
-        if(k<=D){
-         sz = sizes[,k]  
-         # choose which breaks to use, the common ones or the particular ones
-         if(commonscale){
-           bks = commonbks
-         }else{
-            bks = seq(from=min(sz[is.finite(sz)]), to=max(sz[is.finite(sz)]), length.out=length(colscale))
-           dfbks = diff(bks)[1]
-           bks = c(bks[1]-dfbks, bks+0.5*dfbks)
-         }
-         # compute colors and sizes
-         cols = colscale[cut(sz, bks)]
-         sz = cexrange[1]+(cexrange[2]-cexrange[1])*(sz-bks[1])/c(range(bks) %*% c(-1,1))
-         # actual map
-         plot(loc, cex=sz, bg=cols, pch=21, asp=1, main=colnames(data)[k], col=foregroundcolor,
-             xaxt=ifelse(i==drow,"s","n"),yaxt=ifelse(j==1,"s","n")
-         )
-        } 
-       # add axes on extra sides
-       # if(i==1) axis(side=3)
-       if(j==dcol) axis(side=4)
-    }
-  }
-  # return the old graphical parameters
-  #par(mfrow=opar$mfrow, mar=opar$mar, oma=opar$oma)
-  invisible(opar)
-}
-
-
-
-#' Spectral colors palette
-#' based on the RColorBrewer::brewer.pal(11,"Spectral")
-#'
-#' @param n number of colors 
-#'
-#' @return a palette, i.e. a list of colors, from dark blue to dark red over pale yellow.
-#' @export
-#' @importFrom grDevices colorRampPalette
-#' @import RColorBrewer
-#'
-#' @examples
-#' (cls=spectralcolors(20))
-spectralcolors <- function(n){
-  cls = RColorBrewer::brewer.pal(min(11,n), "Spectral")
-  if(n>11){
-    cls = grDevices::colorRampPalette(cls)(n)
-  }
-  return(rev(cls))
-}
-
-# #### MOVED TO COMPOSITIONS #####
-# Compositional panel 1D-density plot
-# Panel minifunction for plotting histograms and kernel densities of the data 
-#
-# @param x ignored (here for compatibility with \code{\link{qqnorm.acomp}})
-# @param y numeric vector of response values
-# @param col color of the plot
-# @param ... further parameters to plotting functions, currently ignored
-# @param alpha alpha level at which the test should be done
-#
-# @return If given to the argument \code{panel} of a function such as \code{\link{qqnorm.acomp}}),
-# this produces a matrix of plots where each panel contains a histogram and a kernel density 
-# overdrawn. If the distribution of this data is accepted to be normal at the \code{alpha}-level
-# by a \code{\link{shapiro.test}}, then the histogram is painted with the \code{col}or provided; 
-# otherwise the histogram bars are empty, but the kernel density curve is colored.
-#
-# @examples
-# data("Windarling")
-# compo = as.matrix(Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")])
-# qqnorm.acomp(compo, panel=vp.lrdensityplot, alpha=0.05) 
-# #### MOVED TO COMPOSITIONS #####
-
-
-
-
-
-# Panel function for 2D-density plots
-# Panel minifunction for plotting 2D kernel densities of the data 
-#
-# @param x numeric vector of response values (axis X)
-# @param y numeric vector of response values (axis Y)
-# @param xaxt style of the X axis labelling (defaults to "n", none)
-# @param yaxt style of the Y axis labelling (defaults to "n", none)
-# @param grid logical, should a unit grid be drawn? defaults to TRUE
-# @param legpos string, position of the correlation coefficient, defaults to "bottomright"
-# @param add logical, should the output be added to an existing diagram? defaults to TRUE, as required for acting as a panel diagram
-# @param colpalette color palette for the image (defaults to spectral colors: blue-yellow-red)
-# @param ... further parameters to image
-#
-# @return If given to the argument \code{panel} of a function such as \code{\link{pairs}}),
-# this produces a matrix of images where each panel contains a 2D kernel density map,
-# using blue for low density regions and dark red for high density colors. 
-# Regression lines (y~x) and correlation coefficients are added as well.
-# @export
-#
-# @examples
-# data("Windarling")
-# compo = Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")]
-# pairs(clr(compo), panel=vp.kde2dplot)
-# vp.kde2dplot = 
-#   function(x, y, xaxt="n", yaxt="n",
-#            grid=TRUE,legpos="bottomright", add=TRUE, colpalette=spectralcolors,...){
-#     aux = MASS::kde2d(x, y, n=50)
-#     aux$z = sqrt(aux$z)
-#     bks = hist(aux$z, plot=F, breaks=20)$breaks
-#     cols = c(NA,colpalette(length(bks)-2))
-#     image(aux, breaks = bks, col=cols, xlab="", ylab="", xaxt=xaxt, yaxt=yaxt,add=add, ...) #yaxt=ifelse(j==1,"s","n")
-#     xgrid = seq(from=floor(min(x)), to = ceiling(max(x)), by=1)
-#     ygrid = seq(from=floor(min(y)), to = ceiling(max(y)), by=1)
-#     abline(lm(y~x), col=2, lwd=2)
-#     if(grid)abline(v=xgrid, h=ygrid, col="#999999")
-#     legend(legpos, legend=round(cor(x,y), dig=3), bg="#999999")
-#   }
-
-
-
-
-
-#' Swath plots
-#' 
-#' Plots of data vs. one spatial coordinate, with local average spline curve
-#'
-#' @param data data to be represented, compositional class, data.frame, or
-#' spatial data object (in which case, \code{loc} is a formula!)
-#' @param ... further arguments to panel plots
-#'
-#' @return Nothing, as the function is primarily called to produce a plot
-#' @export
-#' @importFrom stats loess
-#' @importFrom graphics plot par text lines mtext
-#'
-#' @examples
-#' \donttest{
-#' data("Windarling")
-#' library("sp")
-#' compo = compositions::acomp(Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")])
-#' Northing = Windarling$Northing
-#' swath(compo, Northing)
-#' wind.spdf = sp::SpatialPointsDataFrame(sp::SpatialPoints(Windarling[,6:7]), 
-#'      data=compo)
-#' swath(wind.spdf, loc=Northing)
-#' }
-swath <- function(data, ...) UseMethod("swath", data)
-
-
-
-#' @describeIn swath swath plot default method
-#' @export
-#' @method swath default
-#' @param loc a numeric vector with the values for one coordinate
-#' @param pch symbol to be used for the individual points, defaults to "."
-#' @param withLoess either logical (should a loess line be added?) or a list 
-#' of arguments to DescTools::lines.loess
-#' @param commonScale logical or NA: should all plots share the same vertical 
-#' range? FALSE=no, TRUE=yes (default), for
-#' compositional data sets, the value NA (=all plots within a row) is also 
-#' permitted and is actually the default
-#' @param xlab label for the common x axis (defaults to a deparsed version 
-#' of loc)
-#' @param mfrow distribution of the several plots; it has a good internal default 
-#' (not used for compositional 
-#' classes)
-swath.default <- function(data,  # data (matrix, rmult, aplus, rplus or data.frame)
-                           loc,   # X or Y coordinates (a vector)
-                           pch=".",
-                           withLoess=TRUE, # either logical (should a loess line be added?) or a list of arguments to lines.loess
-                           commonScale=TRUE, # logical: should all plots share the same vertical range? FALSE=no, TRUE=yes
-                           xlab = deparse(substitute(loc)),
-                           ..., # extra arguments to plot
-                           mfrow # automatic distribution of figs
-){
-  # set of swath plots for each possible pwlr, eventually with a loess line 
-  if(missing(mfrow)) mfrow = c(floor(sqrt(ncol(data))), ceiling(ncol(data)/floor(sqrt(ncol(data)))))
-  if(is(data, "Spatial")) return(swath_SpatialPointsDataFrame(data=data, loc=loc, pch=pch, xlab=xlab, mfrow=mfrow,
-                                                    withLoess=withLoess, commonScale=commonScale, ...))
-  # preparations
-  opar = par()
-  opar = par_remove_readonly(opar)
-  
-  on.exit(par(opar))
-  col0 = spectralcolors(10)[10]
-  
-  # isometric representation
-  if("data.frame" %in% class(data)) data = compositions::rmult(as.matrix(data), V=gsi.getV(data), orig=gsi.orig(data))
-  comp = idt(data)
-  
-  # dimensions
-  D = ncol(comp)
-  N = length(loc)
-  
-  # scale the variables
-  if(is.na(commonScale)){
-    rgY = rep(range(unclass(comp), na.rm=TRUE), D)
-    dim(rgY) = c(2,D)
-  }else{
-    rgY = rep(NA, 2*D) 
-    dim(rgY) = c(2,D)
-  }
-  
-  # do the plot
-  par(mfrow=mfrow, mar=c(1,1,3,1)/5, oma=c(5,5,5,3))
-  for(k in 1:D){
-    ij = par()$mfg
-    i = ij[1]
-    j = ij[2]
-    plot(loc, comp[,k], ylim = range(c(comp[,k], rgY[,i]), na.rm=TRUE), pch = pch,
-         xaxt = ifelse(i==D,"s","n"), yaxt = ifelse(j==1,"s","n"),...)
-    title(main=colnames(comp)[k], line=0.5)
-    if(is.logical(withLoess)){
-      if(withLoess & requireNamespace("DescTools")){
-        a = stats::loess(comp[,k]~loc)
-        DescTools::lines.loess(a, col=col0)
-      }
-    }else if("list" %in% class(withLoess) & requireNamespace("DescTools")){
-      args = withLoess
-      if(!("col" %in% names(args))) args$col = col0
-      args$x = stats::loess(comp[,k]~loc)
-      do.call("DescTools::lines.loess", args=args)
-    }
-    # add axes on extra sides
-    if(i==1) axis(side=3)
-    if(j==D) axis(side=4)
-  }
-  # add labels
-  mtext(xlab, side=1, outer=TRUE, line=2.5, cex=1.25)
-  # return the old graphical parameters to freeze the plot
-  #invisible(opar)
-}
-
-
-#' @describeIn swath  Swath plots for acomp objects
-#' @method swath acomp 
-#' @export
-#' @importFrom compositions acomp
-swath.acomp <- function(data,  # composition (rcomp, acomp, ccomp)
-           loc,   # X or Y coordinates (a vector)
-           pch = ".",
-           withLoess = TRUE, # either logical (should a loess line be added?) or a list of arguments to lines.loess
-           commonScale = NA, # logical or NA:
-           xlab = deparse(substitute(loc)),
-           ... # extra arguments to plot
-  ){
-    # recover byRowCommonScale from commonScale
-    byRowCommonScale = ifelse(is.na(commonScale), TRUE, ifelse(commonScale, NA, FALSE))
-    # set of swath plots for each possible pwlr, eventually with a loess line 
-
-    # preparations
-    opar = par()
-    opar = par_remove_readonly(opar)
-    
-    on.exit(par(opar))
-    col0 = spectralcolors(10)[10]
-    comp = data
-    
-    # dimensions
-    D = ncol(comp)
-    N = length(loc)
-
-    # compute pwlrs
-    ij = expand.grid(i=1:D, j=1:D) # pairs of indices
-    mat = matrix(0, nrow=D, ncol=D*D)   # matrix of +1 numerator, -1 denominator
-    for(k in 1:nrow(ij)){
-      mat[ij[k,1],k]=-1
-      mat[ij[k,2],k]=1
-      if(ij[k,1]==ij[k,2]) mat[ij[k,1],k]=0
-    }
-    Zpwlr = unclass(as.matrix(cdt(comp))) %*% mat
-    
-    # scale the variables
-    if(is.na(byRowCommonScale)){
-      rgY = rep(range(Zpwlr, na.rm=TRUE), 2*D)
-      dim(rgY) = c(2,2*D)
-    }else{
-      if(byRowCommonScale){
-        rgY = sapply(1:D, function(i)  range(Zpwlr[,(D*(i-1)+1:D)[-i]], na.rm=TRUE))
-      }else{
-        rgY = rep(NA, 2*D)
-        dim(rgY) = c(2,D)
-      }
-    }
-    
-    # do the plot
-    par(mfrow=c(D,D), mar=c(1,1,1,1)/5, oma=c(5,5,5,3))
-    for(i in 1:D){
-      for(j in 1:D){
-        k = (i-1)*D+j        
-        if(i==j){
-          # diagonal plots, only labels
-          plot(loc, Zpwlr[,k], type="n", xaxt=ifelse(i==D,"s","n"), yaxt=ifelse(j==1,"s","n") )
-          text(mean(range(loc, na.rm=TRUE)), 0, colnames(comp)[i], cex=2)
-        }else{
-          # off-diagonal plots, swaths  
-          plot(loc, Zpwlr[,k], ylim = range(c(Zpwlr[,k], rgY[,i]), na.rm=TRUE), pch=pch,
-               xaxt=ifelse(i==D,"s","n"), yaxt=ifelse(j==1,"s","n"), ...)
-          if(is.logical(withLoess)){
-            if(withLoess){
-              a = stats::loess(Zpwlr[,k]~loc)
-              DescTools::lines.loess(a, col=col0)
-            }
-          }else if("list" %in% class(withLoess)){
-            args = withLoess
-            if(!("col" %in% names(args))) args$col = col0
-            args$x = stats::loess(Zpwlr[,k]~loc)
-            do.call("DescTools::lines.swath", args=args)
-          }
-        }
-        # add axes on extra sides
-        if(i==1) axis(side=3)
-        if(j==D) axis(side=4)
-      }
-    }
-    # add labels
-    label1 = c("+ part", "numerator") [1+is.acomp(data)]
-    label2 = c("- part", "denominator") [1+is.acomp(data)]
-    mtext(label1, side=2, outer=TRUE, line=2.5, cex=1.25)
-    mtext(label2, side=3, outer=TRUE, line=2.5, cex=1.25)
-    mtext(xlab, side=1, outer=TRUE, line=2.5, cex=1.25)
-    # return the old graphical parameters to freeze the plot
-    #invisible(opar)
-}
-
-#' @describeIn swath Swath plots for ccomp objects
-#' @method swath ccomp
-#' @export
-#' @importFrom compositions ccomp
-swath.ccomp <- swath.acomp
-
-#' @describeIn swath Swath plots for rcomp objects
-#' @method swath rcomp 
-#' @export
-#' @importFrom compositions rcomp
-swath.rcomp <- swath.acomp
-
-
-
-
-swath_SpatialPointsDataFrame <- function(data, loc, ...){
-  swath(data@data, loc=loc, ...)
-}
-
-
+#############################################################################
+#####     FUNCTIONS FOR SPATIAL COMPOSITIONAL EXPLORATORY ANALYSIS     ######
+#############################################################################
+
+
+
+#' Check presence of missings
+#' check presence of missings in a data.frame
+#' @param x data, of class data.frame
+#' @param mc not used
+#' @param ... not used
+#'
+#' @return A single true/false response about the presence of any missing value 
+#' on the whole data set
+#' @export
+#' @importFrom compositions has.missings
+#' @method has.missings data.frame
+#'
+#' @examples
+#' library(compositions)
+#' data(Windarling)
+#' has.missings(Windarling)
+#' head(Windarling)
+#' Windarling[1,1] = NA
+#' head(Windarling)
+#' has.missings(Windarling)
+has.missings.data.frame = function(x, mc=NULL, ...){
+  if(is.null(x))
+    return(FALSE)
+  (!is.null(x)) && any(is.na(x))
+}
+
+
+
+
+#' Compositional maps, pairwise logratios
+#' Matrix of maps showing different combinations of components of a composition, in pairwise logratios
+#' @param loc matrix or data.frame of coordinates of the sample locations
+#' @param comp composition observed at every location, can be a matrix, a data.frame or
+#'  of one of the classes \code{compositions::acomp} or \code{compositions::aplus}
+#' @param colscale set of colors to be used as color scale (defaults to 10 colors between blue and red)
+#' @param cexrange symbol size min and max values (default to 0.1 to 2)
+#' @param scale function scaling the set of z-values of each map, defaults to \code{\link{rank}}
+#' @param commonscale logical, should all plots share a common z-scale? defaults to FALSE
+#' @param foregroundcolor color to be used for the border of the symbol
+#' @param closeplot logical, should the plot be left open (FALSE) for further changes, or be frozen (TRUE)? 
+#' defaults to TRUE
+#'
+#' @return The function is primarily called for producing a matrix of (D,D) plots of the D-part 
+#' compositional samples, where at each plot we represent a map whose symbols are colored and 
+#' sized according to a z-scale controlled by a different logratio. For each plot, this is the 
+#' logratio of the row variable by the column variable. However, if `closeplot=FALSE`, 
+#' this function returns 
+#' invisibly the graphical parameters that were active prior to calling this function. This allows 
+#' the user to add further stuff to the plots (mostly, using \code{par(mfg=c(i,j))} to plot on the
+#' diagram (i,j)), or manually freeze the plot (by wrapping the call to \code{pwlrmap} on a call to 
+#' \code{\link{par}}).
+#' @export
+#' @importFrom graphics plot text mtext
+#'
+#' @examples
+#' 
+#' data("Windarling")
+#' coords = as.matrix(Windarling[,c("Easting","Northing")])
+#' compo = Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")]
+#' compo$Rest = 100-rowSums(compo)
+#' compo = compositions::acomp(compo) 
+#' # in quantiles (default, ranking controls color and size)
+#' pwlrmap(coords, compo) 
+#' \donttest{
+#' # in logratios (I=identity)
+#' pwlrmap(coords, compo, scale=I)
+#' # in ratios (i.e., apply exp)
+#' pwlrmap(coords, compo, scale=exp)  
+#' # use only color, no change in symbol size
+#' pwlrmap(coords, compo, cexrange=c(1,1)) 
+#' # change all
+#' pwlrmap(coords, compo, commonscale=TRUE, cexrange=c(1.2,1.2), 
+#'                     colscale=rev(rainbow(40, start=0, end=4/6))) 
+#' }
+pwlrmap = function(loc,   # XY coordinates (matrix or data frame)
+                   comp,  # composition (matrix, acomp, aplus or data.frame)
+                   colscale=rev(rainbow(10, start=0, end=4/6)), # color scale (defaults to 10 colors between blue and red)
+                   cexrange=c(0.1, 2),  # symbol size min and max values (default to 0.1 to 2)
+                   scale=rank, # scaling FUNCTION (defaults to ranking, i.e. quantiles)
+                   commonscale=FALSE,  # should all plots be generated with a common scale?
+                   foregroundcolor="black",
+                   closeplot = TRUE
+                   ){
+  # set of maps where the symbols are chosen according to each possible pwlr, in 
+  # a scale given by the user
+  opar = par()
+  opar = par_remove_readonly(opar)
+  
+  if(closeplot) on.exit(par(opar))
+  # dimensions
+  D = ncol(comp)
+  N = nrow(loc)
+  # compute pwlrs
+  ij = expand.grid(i=1:D, j=1:D) # pairs of indices
+  mat = matrix(0, nrow=D, ncol=D*D)   # matrix of +1 numerator, -1 denominator
+  for(k in 1:nrow(ij)){
+    mat[ij[k,1],k]=-1
+    mat[ij[k,2],k]=1
+  }
+  Zpwlr = as.matrix(log(unclass(comp))) %*% mat
+  # scale the variables
+  if(commonscale){
+    sizes  = scale(c(unlist(Zpwlr)))
+    dim(sizes) = dim(Zpwlr)
+    # calculate common cutting levels
+    commonbks = seq(from=min(sizes[is.finite(sizes)]), to=max(sizes[is.finite(sizes)]), length.out=length(colscale))
+    dfbks = diff(commonbks)[1]
+    commonbks = c(commonbks[1]-dfbks, commonbks+0.5*dfbks)
+  }else{
+    sizes = gmApply(Zpwlr,2,scale)    
+  }
+  # do the plot
+  par(mfrow=c(D,D), mar=c(1,1,1,1)/5, oma=c(3,5,5,3))
+  for(i in 1:D){
+    for(j in 1:D){
+      if(i==j){
+        # diagonal plots, only labels
+        plot(loc, type="n", xaxt=ifelse(i==D,"s","n"), yaxt=ifelse(j==1,"s","n") )
+        text(mean(range(loc[,1])), mean(range(loc[,2])), colnames(comp)[i], cex=2)
+      }else{
+      # off-diagonal plots, maps  
+      k = (i-1)*D+j        
+      sz = sizes[,k]  
+      # choose which breaks to use, the common ones or the particular ones
+      if(commonscale){
+        bks = commonbks
+      }else{
+       bks = seq(from=min(sz[is.finite(sz)]), to=max(sz[is.finite(sz)]), length.out=length(colscale))
+         dfbks = diff(bks)[1]
+       bks = c(bks[1]-dfbks, bks+0.5*dfbks)
+      }
+      # compute colors and sizes
+      cols = colscale[cut(sz, bks)]
+      sz = cexrange[1]+(cexrange[2]-cexrange[1])*(sz-bks[1])/c(range(bks) %*% c(-1,1))
+      # actual map
+      plot(loc, cex=sz, bg=cols, pch=21, asp=1, col=foregroundcolor,
+            xaxt=ifelse(i==D,"s","n"),yaxt=ifelse(j==1,"s","n")
+           )
+      }
+      # add axes on extra sides
+      if(i==1) axis(side=3)
+      if(j==D) axis(side=4)
+    }
+  }
+  # add labels
+  mtext("numerator", side=2, outer=TRUE, line=2.5, cex=1.25)
+  mtext("denominator", side=3, outer=TRUE, line=2.5, cex=1.25)
+  # return the old graphical parameters to freeze the plot
+  invisible(opar)
+}
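# Illustrative sketch (not part of the file): with closeplot = FALSE the layout
# stays open and the previous par() settings are returned invisibly, so extra
# annotation can be added before freezing the figure manually. Assumes the
# 'coords' and 'compo' objects built in the roxygen examples above.
opar <- pwlrmap(coords, compo, closeplot = FALSE)
mtext("Windarling pairwise logratio maps", side = 3, outer = TRUE, line = 4)
par(opar)   # restore the old graphical parameters, freezing the plot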
+
+
+
+
+
+
+#' Multiple maps
+#' Matrix of maps showing different combinations of components of a composition, user defined
+#' 
+#'
+#' @param data data to represent; admits many data containing objects 
+#' (matrix, data.frame, classes from package \code{compositions}) as well 
+#' as their \code{Spatial} counterparts (in which case, \code{loc} is not necessary)
+#' @param ... extra arguments for generic functionality
+#'
+#' @return The function is primarily called for producing a matrix of plots of each component of a 
+#' multivariate data set, such as a compositional data set. Each plot represents a map whose symbols 
+#' are colored and sized according to a z-scale controlled by one of the variables of the data set. 
+#' It can be used with virtually any geometry and any kind of data (compositional, positive, raw, etc.). 
+#' This function returns invisibly the graphical parameters that were active prior to calling this 
+#' function. This allows the user to add further stuff to the plots (mostly, using \code{par(mfg=c(i,j))} 
+#' to plot on the diagram (i,j)), or else freeze the plot (by wrapping the call to \code{pairsmap} 
+#' on a call to \code{\link{par}}).
+#' @export
+#' @importFrom grDevices rainbow
+#' @importFrom graphics par plot
+#'
+#' @examples
+#' data("Windarling")
+#' library("compositions")
+#' coords = as.matrix(Windarling[,c("Easting","Northing")])
+#' compo = Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")]
+#' compo$Rest = 100-rowSums(compo)
+#' compo = acomp(compo)
+#' pairsmap(data=clr(compo), loc=coords) # clr
+#' pairsmap(data=compo, loc=coords) # closed data
+pairsmap <- function(data, ...) UseMethod("pairsmap", data)
+
+
+#' @describeIn pairsmap Multiple maps
+#' @method pairsmap SpatialPointsDataFrame
+pairsmap.SpatialPointsDataFrame <- function(data, ...){
+  pairsmap.default(data@data, loc=sp::coordinates(data), ...)
+}
+
+
+#' @describeIn pairsmap Multiple maps
+#' @method pairsmap default
+#' @export
+#' @param loc matrix or data.frame of coordinates of the sample locations
+#' @param colscale set of colors to be used as color scale (defaults to 10 colors between blue and red)
+#' @param cexrange symbol size min and max values (default to 0.1 to 2)
+#' @param scale function scaling the set of z-values of each map, defaults to \code{\link{rank}}
+#' @param commonscale logical, should all plots share a common z-scale? defaults to FALSE
+#' @param mfrow vector of two integers, giving the number of plots per row and per column of the 
+#' matrix of plots to be created; defaults to something square-ish, somewhat wider than longer, and able to
+#' contain all plots
+#' @param foregroundcolor color to be used for the border of the symbol
+#' @param closeplot logical, should the plot be left open (FALSE) for further changes, or be frozen (TRUE)? 
+#' defaults to TRUE
+pairsmap.default <- function(data,   # data to represent
+                    loc,   # XY coordinates (matrix or data frame)
+                   colscale=rev(rainbow(10, start=0, end=4/6)), # color scale (defaults to 10 colors between blue and red)
+                   cexrange=c(0.1, 2),  # symbol size min and max values (default to 0.1 to 2)
+                   scale=rank, # scaling FUNCTION (defaults to ranking, i.e. quantiles)
+                   commonscale=FALSE,  # should all plots be generated with a common scale?
+                   mfrow = c(floor(sqrt(ncol(data))), ceiling(ncol(data)/floor(sqrt(ncol(data))))),# automatic distribution of figs
+                   foregroundcolor = "black", 
+                   closeplot=TRUE,
+                   ...
+){
+  opar = par()
+  opar = par_remove_readonly(opar)
+  if(closeplot) on.exit(par(opar))
+  # dimensions
+  D = ncol(data)
+  N = nrow(loc)
+  # scale the variables
+  if(commonscale){
+    sizes  = scale(c(unlist(data)))
+    dim(sizes) = dim(data)
+    # calculate common cutting levels
+    commonbks = seq(from=min(sizes[is.finite(sizes)]), to=max(sizes[is.finite(sizes)]), length.out=length(colscale))
+    dfbks = diff(commonbks)[1]
+    commonbks = c(commonbks[1]-dfbks, commonbks+0.5*dfbks)
+  }else{
+    sizes = gmApply(data,2,scale)    
+  }
+  # do the plot
+  par(mfrow=mfrow, mar=c(1,1,10,1)/5, oma=c(3,3,3,3))
+  drow = mfrow[1]
+  dcol = mfrow[2]
+  for(i in 1:drow){
+    for(j in 1:dcol){
+        # off-diagonal plots, maps  
+        k = (i-1)*dcol+j        
+        if(k<=D){
+         sz = sizes[,k]  
+         # choose which breaks to use, the common ones or the particular ones
+         if(commonscale){
+           bks = commonbks
+         }else{
+            bks = seq(from=min(sz[is.finite(sz)]), to=max(sz[is.finite(sz)]), length.out=length(colscale))
+           dfbks = diff(bks)[1]
+           bks = c(bks[1]-dfbks, bks+0.5*dfbks)
+         }
+         # compute colors and sizes
+         cols = colscale[cut(sz, bks)]
+         sz = cexrange[1]+(cexrange[2]-cexrange[1])*(sz-bks[1])/c(range(bks) %*% c(-1,1))
+         # actual map
+         plot(loc, cex=sz, bg=cols, pch=21, asp=1, main=colnames(data)[k], col=foregroundcolor,
+             xaxt=ifelse(i==drow,"s","n"),yaxt=ifelse(j==1,"s","n")
+         )
+        } 
+       # add axes on extra sides
+       # if(i==1) axis(side=3)
+       if(j==dcol) axis(side=4)
+    }
+  }
+  # return the old graphical parameters
+  #par(mfrow=opar$mfrow, mar=opar$mar, oma=opar$oma)
+  invisible(opar)
+}
+
+
+
+#' Spectral colors palette
+#' based on the RColorBrewer::brewer.pal(11,"Spectral")
+#'
+#' @param n number of colors 
+#'
+#' @return a palette, i.e. a list of colors, from dark blue to dark red over pale yellow.
+#' @export
+#' @importFrom grDevices colorRampPalette
+#' @import RColorBrewer
+#'
+#' @examples
+#' (cls=spectralcolors(20))
+spectralcolors <- function(n){
+  cls = RColorBrewer::brewer.pal(min(11,n), "Spectral")
+  if(n>11){
+    cls = grDevices::colorRampPalette(cls)(n)
+  }
+  return(rev(cls))
+}
+
+# #### MOVED TO COMPOSITIONS #####
+# Compositional panel 1D-density plot
+# Panel minifunction for plotting histograms and kernel densities of the data 
+#
+# @param x ignored (here for compatibility with \code{\link{qqnorm.acomp}})
+# @param y numeric vector of response values
+# @param col color of the plot
+# @param ... further parameters to plotting functions, currently ignored
+# @param alpha alpha level at which the test should be done
+#
+# @return If given to the argument \code{panel} of a function such as \code{\link{qqnorm.acomp}}),
+# this produces a matrix of plots where each panel contains a histogram and a kernel density 
+# overdrawn. If the distribution of this data is accepted to be normal at the \code{alpha}-level
+# by a \code{\link{shapiro.test}}, then the histogram is painted with the \code{col}or provided; 
+# otherwise the histogram bars are empty, but the kernel density curve is colored.
+#
+# @examples
+# data("Windarling")
+# compo = as.matrix(Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")])
+# qqnorm.acomp(compo, panel=vp.lrdensityplot, alpha=0.05) 
+# #### MOVED TO COMPOSITIONS #####
+
+
+
+
+
+# Panel function for 2D-density plots
+# Panel minifunction for plotting 2D kernel densities of the data 
+#
+# @param x numeric vector of response values (axis X)
+# @param y numeric vector of response values (axis Y)
+# @param xaxt style of the X axis labelling (defaults to "n", none)
+# @param yaxt style of the Y axis labelling (defaults to "n", none)
+# @param grid logical, should a unit grid be drawn? defaults to TRUE
+# @param legpos string, position of the correlation coefficient, defaults to "bottomright"
+# @param add logical, should the output be added to an existing diagram? defaults to TRUE, as required for acting as a panel diagram
+# @param colpalette color palette for the image (defaults to spectral colors: blue-yellow-red)
+# @param ... further parameters to image
+#
+# @return If given to the argument \code{panel} of a function such as \code{\link{pairs}}),
+# this produces a matrix of images where each panel contains a 2D kernel density map,
+# using blue for low density regions and dark red for high density colors. 
+# Regression lines (y~x) and correlation coefficients are added as well.
+# @export
+#
+# @examples
+# data("Windarling")
+# compo = Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")]
+# pairs(clr(compo), panel=vp.kde2dplot)
+# vp.kde2dplot = 
+#   function(x, y, xaxt="n", yaxt="n",
+#            grid=TRUE,legpos="bottomright", add=TRUE, colpalette=spectralcolors,...){
+#     aux = MASS::kde2d(x, y, n=50)
+#     aux$z = sqrt(aux$z)
+#     bks = hist(aux$z, plot=F, breaks=20)$breaks
+#     cols = c(NA,colpalette(length(bks)-2))
+#     image(aux, breaks = bks, col=cols, xlab="", ylab="", xaxt=xaxt, yaxt=yaxt,add=add, ...) #yaxt=ifelse(j==1,"s","n")
+#     xgrid = seq(from=floor(min(x)), to = ceiling(max(x)), by=1)
+#     ygrid = seq(from=floor(min(y)), to = ceiling(max(y)), by=1)
+#     abline(lm(y~x), col=2, lwd=2)
+#     if(grid)abline(v=xgrid, h=ygrid, col="#999999")
+#     legend(legpos, legend=round(cor(x,y), dig=3), bg="#999999")
+#   }
+
+
+
+
+
+#' Swath plots
+#' 
+#' Plots of data vs. one spatial coordinate, with local average spline curve
+#'
+#' @param data data to be represented, compositional class, data.frame, or
+#' spatial data object (in which case, \code{loc} is a formula!)
+#' @param ... further arguments to panel plots
+#'
+#' @return Nothing, as the function is primarily called to produce a plot
+#' @export
+#' @importFrom stats loess
+#' @importFrom graphics plot par text lines mtext
+#'
+#' @examples
+#' \donttest{
+#' data("Windarling")
+#' library("sp")
+#' compo = compositions::acomp(Windarling[,c("Fe","Al2O3","SiO2", "Mn", "P")])
+#' Northing = Windarling$Northing
+#' swath(compo, Northing)
+#' wind.spdf = sp::SpatialPointsDataFrame(sp::SpatialPoints(Windarling[,6:7]), 
+#'      data=compo)
+#' swath(wind.spdf, loc=Northing)
+#' }
+swath <- function(data, ...) UseMethod("swath", data)
+
+
+
+#' @describeIn swath swath plot default method
+#' @export
+#' @method swath default
+#' @param loc a numeric vector with the values for one coordinate
+#' @param pch symbol to be used for the individual points, defaults to "."
+#' @param withLoess either logical (should a loess line be added?) or a list 
+#' of arguments to DescTools::lines.loess
+#' @param commonScale logical or NA: should all plots share the same vertical 
+#' range? FALSE=no, TRUE=yes (default), for
+#' compositional data sets, the value NA (=all plots within a row) is also 
+#' permitted and is actually the default
+#' @param xlab label for the common x axis (defaults to a deparsed version 
+#' of loc)
+#' @param mfrow layout of the panels as c(rows, columns); a sensible internal 
+#' default is provided (not used for the compositional 
+#' classes)
+swath.default <- function(data,  # data (matrix, rmult, aplus, rplus or data.frame)
+                           loc,   # X or Y coordinates (a vector)
+                           pch=".",
+                           withLoess=TRUE, # either logical (should a loess line be added?) or a list of arguments to lines.loess
+                           commonScale=TRUE, # logical: should all plots share the same vertical range? FALSE=no, TRUE=yes
+                           xlab = deparse(substitute(loc)),
+                           ..., # extra arguments to plot
+                           mfrow # automatic distribution of figs
+){
+  # set of swath plots for each possible pwlr, eventually with a loess line 
+  if(missing(mfrow)) mfrow = c(floor(sqrt(ncol(data))), ceiling(ncol(data)/floor(sqrt(ncol(data)))))
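+  # default layout: roughly square, with floor(sqrt(ncol(data))) rows and enough columns to hold all panels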
+  if(is(data, "Spatial")) return(swath_SpatialPointsDataFrame(data=data, loc=loc, pch=pch, xlab=xlab, mfrow=mfrow,
+                                                    withLoess=withLoess, commonScale=commonScale, ...))
+  # preparations
+  opar = par()
+  opar = par_remove_readonly(opar)
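+  # strip read-only entries from the saved graphical parameters so that par(opar) can restore them on exit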
+  
+  on.exit(par(opar))
+  col0 = spectralcolors(10)[10]
+  
+  # isometric representation
+  if(is(data, "data.frame")) data = compositions::rmult(as.matrix(data), V=gsi.getV(data), orig=gsi.orig(data))
+  comp = idt(data)
+  
+  # dimensions
+  D = ncol(comp)
+  N = length(loc)
+  
+  # scale the variables
+  if(is.na(commonScale)){
+    rgY = rep(range(unclass(comp), na.rm=TRUE), D)
+    dim(rgY) = c(2,D)
+  }else{
+    rgY = rep(NA, 2*D) 
+    dim(rgY) = c(2,D)
+  }
+  
+  # do the plot
+  par(mfrow=mfrow, mar=c(1,1,3,1)/5, oma=c(5,5,5,3))
+  for(k in 1:D){
+    ij = par()$mfg
+    i = ij[1]
+    j = ij[2]
+    plot(loc, comp[,k], ylim = range(c(comp[,k], rgY[,i]), na.rm=TRUE), pch = pch,
+         xaxt = ifelse(i==D,"s","n"), yaxt = ifelse(j==1,"s","n"),...)
+    title(main=colnames(comp)[k], line=0.5)
+    if(is.logical(withLoess)){
+      if(withLoess & requireNamespace("DescTools")){
+        a = stats::loess(comp[,k]~loc)
+        DescTools::lines.loess(a, col=col0)
+      }
+    }else if(is(withLoess, "list") & requireNamespace("DescTools")){
+      args = withLoess
+      if(!("col" %in% names(args))) args$col = col0
+      args$x = stats::loess(comp[,k]~loc)
+      do.call("DescTools::lines.loess", args=args)
+    }
+    # add axes on extra sides
+    if(i==1) axis(side=3)
+    if(j==D) axis(side=4)
+  }
+  # add labels
+  mtext(xlab, side=1, outer=TRUE, line=2.5, cex=1.25)
+  # return the old graphical parameters to freeze the plot
+  #invisible(opar)
+}
+
+
+#' @describeIn swath  Swath plots for acomp objects
+#' @method swath acomp 
+#' @export
+#' @importFrom compositions acomp
+swath.acomp <- function(data,  # composition (rcomp, acomp, ccomp)
+           loc,   # X or Y coordinates (a vector)
+           pch = ".",
+           withLoess = TRUE, # either logical (should a loess line be added?) or a list of arguments to lines.loess
+           commonScale = NA, # logical or NA:
+           xlab = deparse(substitute(loc)),
+           ... # extra arguments to plot
+  ){
+    # recover byRowCommonScale from commonScale
+    byRowCommonScale = ifelse(is.na(commonScale), TRUE, ifelse(commonScale, NA, FALSE))
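+    # mapping: commonScale=NA   -> byRowCommonScale=TRUE  (common scale within each row of panels)
+    #          commonScale=TRUE -> byRowCommonScale=NA    (one global scale for all panels)
+    #          commonScale=FALSE-> byRowCommonScale=FALSE (each panel scaled on its own)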
+    # set of swath plots for each possible pwlr, eventually with a loess line 
+
+    # preparations
+    opar = par()
+    opar = par_remove_readonly(opar)
+    
+    on.exit(par(opar))
+    col0 = spectralcolors(10)[10]
+    comp = data
+    
+    # dimensions
+    D = ncol(comp)
+    N = length(loc)
+
+    # compute pwlrs
+    ij = expand.grid(i=1:D, j=1:D) # pairs of indices
+    mat = matrix(0, nrow=D, ncol=D*D)   # matrix of +1 numerator, -1 denominator
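+    # column k of mat carries +1 at the numerator row ij[k,2] and -1 at the denominator row ij[k,1]
+    # (columns with i==j stay zero), so Zpwlr = cdt(comp) %*% mat below contains all D*D pairwise
+    # contrasts of the cdt-transformed data (the pairwise logratios in the acomp case)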
+    for(k in 1:nrow(ij)){
+      mat[ij[k,1],k]=-1
+      mat[ij[k,2],k]=1
+      if(ij[k,1]==ij[k,2]) mat[ij[k,1],k]=0
+    }
+    Zpwlr = unclass(as.matrix(cdt(comp))) %*% mat
+    
+    # scale the variables
+    if(is.na(byRowCommonScale)){
+      rgY = rep(range(Zpwlr, na.rm=TRUE), 2*D)
+      dim(rgY) = c(2,2*D)
+    }else{
+      if(byRowCommonScale){
+        rgY = sapply(1:D, function(i)  range(Zpwlr[,(D*(i-1)+1:D)[-i]], na.rm=TRUE))
+      }else{
+        rgY = rep(NA, 2*D)
+        dim(rgY) = c(2,D)
+      }
+    }
+    
+    # do the plot
+    par(mfrow=c(D,D), mar=c(1,1,1,1)/5, oma=c(5,5,5,3))
+    for(i in 1:D){
+      for(j in 1:D){
+        k = (i-1)*D+j        
+        if(i==j){
+          # diagonal plots, only labels
+          plot(loc, Zpwlr[,k], type="n", xaxt=ifelse(i==D,"s","n"), yaxt=ifelse(j==1,"s","n") )
+          text(mean(range(loc, na.rm=TRUE)), 0, colnames(comp)[i], cex=2)
+        }else{
+          # off-diagonal plots, swaths  
+          plot(loc, Zpwlr[,k], ylim = range(c(Zpwlr[,k], rgY[,i]), na.rm=TRUE), pch=pch,
+               xaxt=ifelse(i==D,"s","n"), yaxt=ifelse(j==1,"s","n"), ...)
+          if(is.logical(withLoess)){
+            if(withLoess){
+              a = stats::loess(Zpwlr[,k]~loc)
+              DescTools::lines.loess(a, col=col0)
+            }
+          }else if(is(withLoess, "list")){
+            args = withLoess
+            if(!("col" %in% names(args))) args$col = col0
+            args$x = stats::loess(Zpwlr[,k]~loc)
+            do.call("DescTools::lines.swath", args=args)
+          }
+        }
+        # add axes on extra sides
+        if(i==1) axis(side=3)
+        if(j==D) axis(side=4)
+      }
+    }
+    # add labels
+    label1 = c("+ part", "numerator") [1+is.acomp(data)]
+    label2 = c("- part", "denominator") [1+is.acomp(data)]
+    mtext(label1, side=2, outer=TRUE, line=2.5, cex=1.25)
+    mtext(label2, side=3, outer=TRUE, line=2.5, cex=1.25)
+    mtext(xlab, side=1, outer=TRUE, line=2.5, cex=1.25)
+    # return the old graphical parameters to freeze the plot
+    #invisible(opar)
+}
+
+#' @describeIn swath Swath plots for ccomp objects
+#' @method swath ccomp
+#' @export
+#' @importFrom compositions ccomp
+swath.ccomp <- swath.acomp
+
+#' @describeIn swath Swath plots for rcomp objects
+#' @method swath rcomp 
+#' @export
+#' @importFrom compositions rcomp
+swath.rcomp <- swath.acomp
+
+
+
+
+swath_SpatialPointsDataFrame <- function(data, loc, ...){
+  swath(data@data, loc=loc, ...)
+}
+
+
diff --git a/R/genDiag.R b/R/genDiag.R
index dd2b51634675054f6e9652bb94ecad074ed2a059..6235982c1a3b31961f54c6f59638ca9355a8d54e 100644
--- a/R/genDiag.R
+++ b/R/genDiag.R
@@ -131,7 +131,7 @@ Maf = function(x,...) UseMethod("Maf",x)
 #' spatial imaging, Stanford University, Palo Alto, USA, 14pp.
 #' 
 #' Tichavsky, P. and Yeredor, A., 2009. Fast approximate joint diagonalization 
-#' incorporating weight matrices. IEEE Transactions on Signal Processing 57, 878 – 891.
+#' incorporating weight matrices. IEEE Transactions on Signal Processing 57, 878-891.
 #' 
 #' @family generalised Diagonalisations
 #'
@@ -192,7 +192,7 @@ Maf.data.frame <- function(x, vg, i=2,...){
   res = list(sdev=sdevs, loadings = W, center=mn, scale=rep(1, ncol(x)),
              n.obs=nrow(x), scores = facscores, C1=C1, C2=C2, eigenvalues = eigenvalues[ord], 
              invLoadings=Winv)
-  if(class(x)!="data.frame"){
+  if(!is(x,"data.frame")){
     res$Center = cdtInv(mn)
     res$InvLoadings = cdtInv(Winv, orig=x)
     res$InvDownLoadings = cdtInv(-Winv, orig=x)
@@ -447,7 +447,7 @@ predict.genDiag = function (object, newdata=NULL, ...) {
   requireNamespace("compositions", quietly=TRUE)
   if(is.null(newdata))
     newdata = object$scores
-  if("data.frame" %in% class(newdata))
+  if(is(newdata, "data.frame"))
     newdata = as.matrix(newdata)
   Z = newdata %*% unclass(object$invLoadings)
   if("Center" %in% names(object)){
@@ -478,9 +478,9 @@ predict.genDiag = function (object, newdata=NULL, ...) {
 #' @importFrom compositions coloredBiplot
 #' @method coloredBiplot genDiag
 #' 
-#' @references Mueller, Tolosana-Delgado, Grunsky and McKinley (2021) Biplots for 
-#' Compositional Data Derived from Generalised Joint Diagonalization Methods. 
-#' Applied Computational Geosciences (under review)
+#' @references Mueller, Tolosana-Delgado, Grunsky and McKinley (2020) Biplots for 
+#' compositional data derived from generalised joint diagonalization methods. 
+#' Applied Computational Geosciences 8:100044 \doi{10.1016/j.acags.2020.100044}
 #'
 #' @examples
 #' data("jura", package="gstat")
diff --git a/R/geostats.R b/R/geostats.R
index 481d3330e5ebc2c7ae1b2d89d006672203b60468..72e5e0bbb5e7507d5990597948a9095cbad52de2 100644
--- a/R/geostats.R
+++ b/R/geostats.R
@@ -50,7 +50,7 @@ gsi.calcCgram <- function(X,Y,vgram,ijEqual=FALSE) {
   nY<-nrow(Y)
   k <- length(vgram$type)
   dimC = c(d,nX,d,nY)
-  erg<- .C("CcalcCgram",
+  erg<- .C(CcalcCgram,
            dimX=checkInt(dim(X),2),
            ldX=checkInt(dim(X)[1],1),
            X=checkDouble(X),
@@ -70,8 +70,9 @@ gsi.calcCgram <- function(X,Y,vgram,ijEqual=FALSE) {
   structure(erg$C,dim=c(d*nX,d*nY))
 }
 
+#' @useDynLib gmGeostats getUnitvecR
 gsiGetUnitVec <- function(dimX,ip) {
-    erg = .C("getUnitvecR",dimX=checkInt(dimX,1),ip=checkInt(ip,1),unitvec=numeric(dimX))
+    erg = .C(getUnitvecR,dimX=checkInt(dimX,1),ip=checkInt(ip,1),unitvec=numeric(dimX))
     unitvec = erg$unitvec
     unitvec
     }
@@ -119,7 +120,7 @@ gsi.TurningBands <- function(X,vgram,nBands,nsim=NULL) {
   }
   # run!
   if( is.null(nsim) ) {
-    erg<-.C("CMVTurningBands",
+    erg<-.C(CMVTurningBands,
             dimX=checkInt(dim(X),2),
             X=checkDouble(X,prod(dim(X))),
             dimZ=checkInt(c(d,nrow(X),1),3),
@@ -137,7 +138,7 @@ gsi.TurningBands <- function(X,vgram,nBands,nsim=NULL) {
   } else {
     nsim <- checkInt(nsim,1)
     stopifnot(nsim>0)
-    erg<-.C("CMVTurningBands",
+    erg<-.C(CMVTurningBands,
             dimX=checkInt(dim(X),2),
             X=checkDouble(X,prod(dim(X))),
             dimZ=checkInt(c(d,nrow(X),nsim),3),
@@ -209,7 +210,7 @@ gsi.CondTurningBands <- function(Xout, Xin, Zin, vgram,
     vgram$M = M
     m = 3
   }
-  erg <- .C("CCondSim",
+  erg <- .C(CCondSim,
             dimZin=checkInt(dimZin,2),
             Zin   =checkDouble(Zin),
             Cinv  =checkDouble(Cinv,length(Zin)^2),
diff --git a/R/gmValidationStrategy.R b/R/gmValidationStrategy.R
index 2db7e5548b228424baf10a2d05892810b21f2a6e..a00fffc493025b5b0de68720c520954349f6bc08 100644
--- a/R/gmValidationStrategy.R
+++ b/R/gmValidationStrategy.R
@@ -81,7 +81,7 @@ validate <- function(object, strategy, ...) UseMethod("validate", strategy)
 #' @method validate LeaveOneOut
 #' @export
 validate.LeaveOneOut = function(object, strategy, ...){
-  if("gstat" %in% class(object)){
+  if(inherits(object,what="gstat",which=FALSE)){
     n = nrow(object$data[[1]]$data)
   }else if(is(object, "gmSpatialModel")){
     n = nrow(object@data)
@@ -99,7 +99,7 @@ validate.LeaveOneOut = function(object, strategy, ...){
 #' @export
 validate.NfoldCrossValidation = function(object, strategy, ...){
   # manage "gstat" case
-  if("gstat" %in% class(object)){
+  if(is(object,"gstat")){
     warning("validate: object provided is of class 'gstat', attempting 'gstat.cv(..., remove.all=TRUE, all.residuals=TRUE)'")
     return(gstat::gstat.cv(object, nfold=strategy$nfolds, remove.all = TRUE, all.residuals = TRUE))
   }
diff --git a/R/gstatCompatibility.R b/R/gstatCompatibility.R
index cb9b9fcc41e726dfb3cdb0e88b2416c809b6bfab..90d8ea913b123298b99f1f4d3be57bd6d66a083f 100644
--- a/R/gstatCompatibility.R
+++ b/R/gstatCompatibility.R
@@ -246,13 +246,13 @@ variogramModelPlot.gstatVariogram =
       model = lapply(noms, function(i) gstat::vgm(model="Sph", psill=0, range=1, nugget=0))
       names(model)=noms
     }else{
-      if("variogramModelList" %in% class(model)){
+      if(is(model, "variogramModelList")){
         vrnames = names(model)
         vrnames = vrnames[-grep(".", vrnames, fixed=TRUE)]
-      }else if("gstat" %in% class(model)){
+      }else if(is(model, "gstat")){
         vrnames = names(model$data)
         model = model$model
-      }else if("variogramModel" %in% class(model) ){
+      }else if(is(model,"variogramModel") ){
         vrnames = levels(model$id)[1]
       }else{
         ggtr = tryCatch(as.variogramModel(model))
diff --git a/R/mask.R b/R/mask.R
index f477b0012cf8a3572db388c7b4889a4bb3899d70..53a1b3886cc61f1f4b3c1816f38c9bcf86c7ab0a 100644
--- a/R/mask.R
+++ b/R/mask.R
@@ -377,7 +377,7 @@ unmask <- function(x,...) UseMethod("unmask", x)
 #' a "unmask.DataFrameStack" produces a "unmask.DataFrameStack"; 
 #' a "SpatialPoints" produces a "SpatialPoints"; and finally
 #' a "SpatialPixels" produces either a "SpatialPixels" or a "SpatialGrid" (if it is full).
-#' Note that only in the case that `class(x)=="SpatialPixels"` is `mask` required,
+#' Note that only in the case that `is(x,"SpatialPixels")` returns TRUE is `mask` required,
 #' for the other methods all arguments have reasonable defaults.
 #' @export
 unmask.data.frame <- function(x, mask=attr(x,"mask"), fullgrid = attr(mask, "fullgrid"), 
diff --git a/R/variograms.R b/R/variograms.R
index 1c0c9dc63b66093915fb508edc96fa68445de59d..21b6b5456d3832987cc6d6211ef90347dec05ff4 100644
--- a/R/variograms.R
+++ b/R/variograms.R
@@ -132,7 +132,7 @@ setCgram = function(type, nugget=sill*0, sill, anisRanges, extraPar=0){
 #' vm = v1+v2
 "+.gmCgram" <- function(x,y) {
   y = as.gmCgram(y)
-  stopifnot(class(y)=="gmCgram",
+  stopifnot(is(y,"gmCgram"),
             dim(x$sill)[-1]==dim(y$sill)[-1],
             dim(x$M)[-1]==dim(y$M)[-1])
   myfun = function(A,B){
diff --git a/cran-comments.md b/cran-comments.md
index 27052c5ddd2b1fcf67de163dccdb0f096285bf61..511b6e8f31da6c55f2f3e81673f762255373a19e 100644
--- a/cran-comments.md
+++ b/cran-comments.md
@@ -2,7 +2,10 @@
   
 * Linux: Ubuntu 18.04.4LTS  (local)
   - R 3.6.3 (production)
-  - R devel 2022-04-23 r82240 (test)
+  - R 4.2.3 patched (2023-03-29 r84146) (test)
+  - R devel (2023-04-02 r84146) (test)
+* Windows 7 Enterprise
+  - R version 4.2.2 (2022-10-31 ucrt) (test)
 
   
 ## R CMD check results
@@ -11,22 +14,26 @@
   
   
 ## This is a resubmission
-  
+
+Request from 2022-11-21
+
 ```
-Dear maintainer,
+libstdc++ is the support library for g++, but not for other C++ compilers.  It may not exist on other systems, may be inappropriate (as on fedora-clang) or may be faked (as on macOS).
+
+It should never be necessary as R links C++ code with the C++ compiler.
 
-Please see the problems shown on
-<https://www.stats.ox.ac.uk/pub/bdr/BLAS/gmGeostats.out>
+Please correct ASAP for
 
-For reproduction details see
-<https://www.stats.ox.ac.uk/pub/bdr/BLAS/README.txt> .
+edlibR gmGeostats numbat sass scistreer
 
-Please correct before 2022-05-05 to safely retain your package on CRAN.
+and before 2022-12-05 to safely retain the package on CRAN.
 
-The CRAN Team
+--
+Brian D. Ripley,                  ripley@stats.ox.ac.uk
+Emeritus Professor of Applied Statistics, University of Oxford
 ```
   
-... related to the way classes were assessed (now using `inherits(...)`) and the calls of FORTRAN routines from C code that involve string arguments (now following the instructions given in <https://www.stats.ox.ac.uk/pub/bdr/BLAS/README.txt> ).
+The flag mentioned has been removed as requested.
   
 
 best regards
\ No newline at end of file
diff --git a/src/Makevars b/src/Makevars
index 534b476314013d5df4b5283ae63af9db2a48f181..159b57ee7920a697cddd8d84f180efcf4f81ce4d 100644
--- a/src/Makevars
+++ b/src/Makevars
@@ -1,5 +1,5 @@
 PKG_CFLAGS = $(SHLIB_OPENMP_CFLAGS)
-PKG_LIBS =  $(FLIBS) -lstdc++ $(SHLIB_OPENMP_CFLAGS)
+PKG_LIBS =  $(FLIBS) $(SHLIB_OPENMP_CFLAGS)
 
 C_OBJS = gmGeostats.o
 
diff --git a/src/Makevars.win b/src/Makevars.win
index 2e8e29d4ad7f7e7ccc5cd00eaad156081c573d50..a1a3c30ee706911ee3aa7b759870ec49377351e2 100644
--- a/src/Makevars.win
+++ b/src/Makevars.win
@@ -1,5 +1,5 @@
 PKG_CFLAGS = $(SHLIB_OPENMP_CFLAGS)
-PKG_LIBS =  $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS) -lstdc++ $(SHLIB_OPENMP_CFLAGS)
+PKG_LIBS =  $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS) $(SHLIB_OPENMP_CFLAGS)
 
 C_OBJS = gmGeostats.o
 
diff --git a/src/gmGeostats.c b/src/gmGeostats.c
index 5d6b7200225cd7c329b2f122b3455921e9cf7bb3..e5dd1ea40e60420fb8917b9d5ab074d95018b893 100644
--- a/src/gmGeostats.c
+++ b/src/gmGeostats.c
@@ -218,8 +218,14 @@ const double *extra /* extra parameter (unused), for consistency with other cova
    * where projs were calculated. Evaluate a wave of random phase at the
    * projs points. Return that wave */	
   int i;
-  double phase,amp,omega,d1;   
-  omega = norm_rand()  * M_SQRT2 / range * M_SQRT_3; /* sqrt(3) needed to produce effective range*/
+  double phase,amp,omega,d1,o1,o2,o3;   
+  /* old: omega = norm_rand() * M_SQRT2 / range * M_SQRT_3;  sqrt(3) needed to produce effective range */
+  // chi(3) distributed radial frequency, with random sign
+  o1 = norm_rand();
+  o2 = norm_rand();
+  o3 = norm_rand();
+  omega = sqrt(o1*o1 + o2*o2 + o3*o3) * sign(o1);
+  omega = omega * M_SQRT2 / range * M_SQRT_3; /* sqrt(3) needed to produce effective range*/
   phase = unif_rand() * M_2PI;
   amp  = M_SQRT2; /* Lantuejoul (2002), page 191 */
   for(i=0;i<n;i++) {
diff --git a/vignettes/register_new_layer_datatype.Rmd b/vignettes/register_new_layer_datatype.Rmd
index 13d008b2eaffde966f772383f9a2a021c40da3bd..7e0dda23c90c432dc1d1396d3a49b3f290168986 100644
--- a/vignettes/register_new_layer_datatype.Rmd
+++ b/vignettes/register_new_layer_datatype.Rmd
@@ -236,13 +236,13 @@ theta.vg %>% recompute_complex_cov %>% plot
 ```
 This is again a quick and dirty solution, as:
 
-1. by using the plotting capacities of "gstat" for an incompplete object we waste half the space of the plot;
+1. by using the plotting capacities of "gstat" for an incomplete object we waste half the space of the plot;
 2. actually the current function can only deal with symmetric covariances (because of the way that gstat::variogram estimates the covariance), and
 3. hence the imaginary part is identically zero.
 
 ### Non-symmetric covariance
 
-Hence, we need to improve the computations slightly. First we need to consider estimates for the whole 360$^o$, so that we can obtain $C_{VU}(h)=C_{UV}(-h)$, 
+We need to improve the computations slightly. First we need to consider estimates for the whole 360$^o$, so that we can obtain $C_{VU}(h)=C_{UV}(-h)$, 
 ```{r}
 # compute variogram for the whole circle, i.e. until 360 deg
 theta.cvg = gmGeostats::variogram(
@@ -342,10 +342,10 @@ Boogaart, K. G. v. d.; Tolosana-Delgado, R. (2013) _Analysing compositional data
 Boogaart, K.G. v.d.; Tolosana-Delgado, Raimon; Bren, Matevz (2021). compositions: Compositional Data Analysis. R package version 2.0-2. http://www.stat.boogaart.de/compositions/
 
 
-De Iaco, S.; Posa, D.; Palma, M. (2013) Complex-Valued Random Fields for Vectorial Data: Estimating and Modeling Aspects. _Mathematical Geosciences_ 45: 557–573
+De Iaco, S.; Posa, D.; Palma, M. (2013) Complex-Valued Random Fields for Vectorial Data: Estimating and Modeling Aspects. _Mathematical Geosciences_ 45: 557-573
 
 
 Stevens, S. (1946) On the theory of scales of measurement. _Science_ 103: 677-680
 
 
-Wackernagel, H. (2003) _Multivariate geostatistics—an introduction with applications_, 3rd edn. Springer, Berlin 
+Wackernagel, H. (2003) _Multivariate geostatistics: an introduction with applications_, 3rd edn. Springer, Berlin