Has anyone put together/found a good method for listing all the S3 methods available for a given object? The built-in methods()
function will give all available methods for a specified class, or for a specified generic function, but not for an object.
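For reference, these are the two call forms methods() does support (a quick illustration):

methods(class = "lm")  # every S3 method defined for class "lm"
methods(add1)          # every S3 method for the add1 generic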
The example I have in mind is a glm object, which has "glm" as its first (most specific) class but also inherits from "lm":

g <- glm(y~x, data=data.frame(x=1:10, y=1:10))
class(g)
## [1] "glm" "lm"
There are 35 methods for class "lm" and 22 for "glm". I'm interested in an answer that combines the results of

lapply(class(g), function(x) methods(class=x))

in a sensible way, so that I can immediately see (for example) that there is a glm-specific method for add1, but that the method for alias is inherited from the lm class.
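For a single generic this can be checked by hand, e.g. with getS3method() (its optional = TRUE argument returns NULL instead of an error when no method exists for that exact class):

getS3method("add1", "glm", optional = TRUE)   # a function: a glm-specific method exists
getS3method("alias", "glm", optional = TRUE)  # NULL: dispatch falls back to alias.lm

But doing that generic by generic is exactly what I'd like to avoid.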
Does someone have a slick way to do this, or does it already exist?
PS Steve Walker's S3-S4-reference class glossary shows that this works automatically for reference classes, where we have to use an object to get the methods (x$getRefClass()$methods()).
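For comparison, here is a minimal reference-class sketch (the Account class is made up, just to show the call):

Account <- setRefClass("Account",
  fields  = list(balance = "numeric"),
  methods = list(deposit = function(x) balance <<- balance + x)
)
a <- Account$new(balance = 0)
a$getRefClass()$methods()  # method names, obtained from the object itself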
Here's an attempt to replicate the "standard" behavior:

classMethods <- function(cl) {
  # Accept either an object or a character vector of class names
  if (!is.character(cl)) {
    cl <- class(cl)
  }
  ml <- lapply(cl, function(x) {
    # Build a regex like "\\.glm$" matching the ".<class>" suffix of a method
    # name, escaping "." and "[" so class names like "data.frame" match literally
    sname <- gsub("([.[])", "\\\\\\1", paste0(".", x, "$"))
    m <- methods(class = x)
    data.frame(
      m = as.vector(m),                 # full method name, e.g. "add1.glm"
      c = x,                            # class the method is defined for
      n = sub(sname, "", as.vector(m)), # bare generic name, e.g. "add1"
      attr(m, "info"),                  # visibility/origin info from methods()
      stringsAsFactors = FALSE
    )
  })
  df <- do.call(rbind, ml)
  # Keep only the first method found per generic: since 'cl' is in inheritance
  # order, this is the method that dispatch would actually pick
  df <- df[!duplicated(df$n), ]
  # Rebuild an object that prints like the result of methods()
  structure(df$m,
            info = data.frame(visible = df$visible, from = df$from),
            class = "MethodsFunction")
}
And then you can try it out with
g <- glm(y~x,data=data.frame(x=1:10,y=1:10))
classMethods(g)
#or classMethods(c("glm","lm"))
and that will return
[1] add1.glm* anova.glm confint.glm* cooks.distance.glm*
[5] deviance.glm* drop1.glm* effects.glm* extractAIC.glm*
[9] family.glm* formula.glm* influence.glm* logLik.glm*
[13] model.frame.glm nobs.glm* predict.glm print.glm
[17] residuals.glm rstandard.glm rstudent.glm summary.glm
[21] vcov.glm* weights.glm* alias.lm* case.names.lm*
[25] dfbeta.lm* dfbetas.lm* dummy.coef.lm* hatvalues.lm
[29] kappa.lm labels.lm* model.matrix.lm plot.lm
[33] proj.lm* qr.lm* simulate.lm* variable.names.lm*
Non-visible functions are asterisked
It's not as elegant or short as Josh's, but I think it's a good recreation of the default behavior. It's funny to see that the methods function is itself mostly just a grep across all known function names; I borrowed the gsub stuff from there.
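A rough illustration of that grep idea (just the gist, not the actual implementation of methods(), which also checks namespaces' registered methods):

fns <- unlist(lapply(search(), ls))  # all names visible on the search path
grep("\\.glm$", fns, value = TRUE)   # candidate method names ending in ".glm"
# (methods registered but not exported by a namespace won't show up this way)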
Here's a function that will at least tell you which S3 methods an object will initially trigger:

findMethodsS3 <- function(object) {
  # Gather the methods for every class the object inherits from,
  # in inheritance order (most specific class first)
  x <- unlist(lapply(class(object), function(x) methods(class = x)))
  # file_path_sans_ext() strips the trailing ".<class>", so duplicated()
  # keys on the bare generic name and keeps the first (dispatched) method
  sort(x[!duplicated(tools::file_path_sans_ext(x))])
}
findMethodsS3(g)
# [1] "add1.glm" "alias.lm" "anova.glm"
# [4] "case.names.lm" "confint.glm" "cooks.distance.glm"
# [7] "deviance.glm" "dfbeta.lm" "dfbetas.lm"
# [10] "drop1.glm" "dummy.coef.lm" "effects.glm"
# [13] "extractAIC.glm" "family.glm" "formula.glm"
# [16] "hatvalues.lm" "influence.glm" "kappa.lm"
# [19] "labels.lm" "logLik.glm" "model.frame.glm"
# [22] "model.matrix.lm" "nobs.glm" "plot.lm"
# [25] "predict.glm" "print.glm" "proj.lm"
# [28] "qr.lm" "residuals.glm" "rstandard.glm"
# [31] "rstudent.glm" "simulate.lm" "summary.glm"
# [34] "variable.names.lm" "vcov.glm" "weights.glm"
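One caveat with the file_path_sans_ext() trick: it strips only the final dot-suffix, so a class name that itself contains a dot (e.g. "data.frame") is not reduced to the bare generic name and could slip past the duplicated() check:

tools::file_path_sans_ext("print.data.frame")
# [1] "print.data"    (not "print")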