I use OS X Yosemite with XQuartz, as suggested in other questions, and I've been attempting to publish a notebook but get the same error every time. This is what the .R file looks like:
#' ---
#' title: "MLB Payroll Analysis"
#' author: "Steven Quartz Universe"
#' date: "21 March 2015"
#' output: pdf_document
#' ---
#loading the payroll data from the Python document
payroll <- read.table("~/Documents/payroll.txt", header=TRUE, quote="\"")
View(payroll)
summary(payroll)
bank <- payroll$PayrollMillions
wins <- payroll$X2014Wins
#displaying the mean and sd of payroll and wins (out of 162, of course)
mean(bank)
sd(bank)
mean(wins)
sd(wins)
#setting a linear regression
reg <- lm(wins ~ bank)
summary(reg)
#the regression is significant at the .10 level (p-value .05072),
#but the R-squared is only .1296, so the relationship is weak
#a means of comparing the histogram to a normal distribution
histNorm <- function(x, densCol = "darkblue") {
  m <- mean(x)
  std <- sqrt(var(x))
  h <- max(hist(x, plot = FALSE)$density)
  d <- dnorm(x, mean = m, sd = std)
  maxY <- max(h, d)
  hist(x, prob = TRUE,
       xlab = "x", ylim = c(0, maxY),
       main = "(Probability) Histogram with Normal Density")
  curve(dnorm(x, mean = m, sd = std),
        col = densCol, lwd = 2, add = TRUE)
}
#showing the histogram with normal distribution line
histNorm(reg$residuals, "purple")
#QQplots and Shapiro-Wilk test
qqnorm(reg$residuals)
qqline(reg$residuals)
shapiro.test(reg$residuals)
#p-value is .383; this can be considered a normal distribution
plot(reg$fitted.values, reg$residuals)
abline(h = 0)
#variances are wide, but in a channel
install.packages("lmtest")
library(lmtest)
bptest(reg)
#p-value of .849 given; we can assume variances are constant throughout the distribution
hats <- hatvalues(reg)
hatmu <- mean(hats)
hats[hats > 2 * hatmu]
#we get teams 14 and 19 with high leverage; the Dodgers and Yankees with their astronomical payrolls
treg <- rstudent(reg)
n <- length(treg)
p <- length(reg$coefficients)
df <- n - p - 1
alpha <- 0.05
#no bonferroni correction for outliers
crit <- qt(1 - alpha/2,df)
treg[abs(treg) > crit]
#no outliers are found
#with bonferroni correction
crit <- qt(1 - (alpha/2)/n,df)
treg[abs(treg) > crit]
#no outliers are found
#comparison of outlier tests
pvals <- pt(-abs(treg),df)*2
padjb <- p.adjust(pvals, method = "bonferroni")
padjf <- p.adjust(pvals, method = "fdr")
cbind(pvals,padjb,padjf)
When I hit Compile Notebook, this is the output:
processing file: payroll.spin.Rmd
  |......................                       |  33%
  ordinary text without R code
  |...........................................  |  67%
label: unnamed-chunk-1
Quitting from lines 9-90 (payroll.spin.Rmd)
Error in contrib.url(repos, "source") :
  trying to use CRAN without setting a mirror
Calls: <Anonymous> ... withVisible -> eval -> eval -> install.packages -> contrib.url
I've looked through other questions on how to rectify this and tried the command-line fixes, but to no avail. Could someone point out what I'm doing wrong? Thanks kindly.
The line
install.packages("lmtest")
is the problem here. As is hinted by the error message
Error in contrib.url(repos, "source") :
trying to use CRAN without setting a mirror
you need to tell install.packages() which repository to use. So changing it to (for instance):
install.packages("lmtest", repos = "http://cran.us.r-project.org")
should do the trick. But as MrFlick and Ben Bolker pointed out in the comments, the install should probably only happen when the package is not already installed.
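For instance, a minimal sketch of that guard (using the same mirror URL as above):
# install lmtest only if it is not already available, then attach it
if (!requireNamespace("lmtest", quietly = TRUE)) {
  install.packages("lmtest", repos = "http://cran.us.r-project.org")
}
library(lmtest)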
I had this same issue with a Knit HTML publish; I modified the very beginning of the file like so:
---
title: "dialectic"
author: "micah smith"
date: "3/4/2017"
output: html_document
---
```{r setup, include=FALSE}
chooseCRANmirror(graphics=FALSE, ind=1)
knitr::opts_chunk$set(echo = TRUE)
```
The chooseCRANmirror(graphics=FALSE, ind=1) call was the line that fixed it.
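If you would rather not rely on the mirror list index, a common alternative (a sketch, not part of the original answer) is to set the repos option explicitly in that same setup chunk:
```{r setup, include=FALSE}
# point R at a CRAN mirror explicitly so install.packages() works while knitting
options(repos = c(CRAN = "https://cran.us.r-project.org"))
knitr::opts_chunk$set(echo = TRUE)
```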