I have been using R Markdown for quite some time, but now when I try to knit my document it fails with the following error message:
LaTeX Error: Lonely \item--perhaps a missing list environment.
I am not sure why this happens, since it was working for me before. I have two shortcuts in the preamble of my document:
\newcommand{\benum}{\begin{enumerate}}
\newcommand{\eenum}{\end{enumerate}}
I have a feeling that these could be the cause of my problems, which is frustrating because I have used them for a long time with no trouble.
Any help would be appreciated!
EDIT:
Here is a minimal document I made. It will not knit, and the same error message described above comes up.
---
title: "Minimal Document"
author: Aiden Kenny
date: Friday, 09/21/2018
header-includes:
- #\usepackage{setspace}\doublespacing
- \newcommand{\benum}{\begin{enumerate}}
- \newcommand{\eenum}{\end{enumerate}}
- \usepackage{xcolor}
fontsize: 12pt
geometry: margin=1in
output: pdf_document
---
\newpage
```{r setup, include = FALSE}
knitr::opts_chunk$set(fig.width = 10, fig.height = 5, echo = TRUE)
library(mosaic)
library(knitr)
library(scatterplot3d)
```
1. Here is a sample of some code I found online. The code chunk by
itself will run fine, so that is not the issue.
```{r, echo=FALSE}
require(stats); require(graphics)
plot(cars, xlab = "Speed (mph)", ylab = "Stopping distance (ft)",
     las = 1)
lines(lowess(cars$speed, cars$dist, f = 2/3, iter = 3), col = "red")
title(main = "cars data")
plot(cars, xlab = "Speed (mph)", ylab = "Stopping distance (ft)",
     las = 1, log = "xy")
title(main = "cars data (logarithmic scales)")
lines(lowess(cars$speed, cars$dist, f = 2/3, iter = 3), col = "red")
summary(fm1 <- lm(log(dist) ~ log(speed), data = cars))
opar <- par(mfrow = c(2, 2), oma = c(0, 0, 1.1, 0),
            mar = c(4.1, 4.1, 2.1, 1.1))
plot(fm1)
par(opar)
```
EDIT: I have been playing around with this, and this seems to be part of the issue:
\begin{enumerate}
\item Using the default enumerate/itemize commands
\item DO work!
\end{enumerate}
but...
\benum
\item Using the shortcut commands I made
\item DO NOT work!
\eenum
When I run the code chunks individually, they run just fine and produce the desired graphs, so I suspect this is a LaTeX issue. But I am not a TeX person, so I am not sure.
The reason I initially made these shortcut commands was so that I could use chunks of R code inside this type of environment (see https://tex.stackexchange.com/questions/210003/how-can-i-nest-a-code-chunk-within-an-enumerate-environment-when-using-r-markdow).
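To be concrete, the kind of thing I was going for looks roughly like this (just a sketch, not my actual document):

````
\benum
\item Some explanation, followed by a code chunk:

```{r}
summary(cars)
```

\item More explanation after the chunk.
\eenum
````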
As mentioned by someone below, this could have to do with something called Pandoc?
Thanks!
There is no need for raw LaTeX. The following knits fine and produces the result that I think you want to achieve:
---
title: "Minimal Document"
author: Aiden Kenny
date: Friday, 09/21/2018
header-includes:
- \usepackage{xcolor}
fontsize: 12pt
geometry: margin=1in
output:
  pdf_document:
    keep_tex: yes
---
\newpage
```{r setup, include = FALSE}
knitr::opts_chunk$set(fig.width = 10, fig.height = 5, echo = TRUE)
library(knitr)
```
1. Here is a sample of some code I found online. The code chunk by
itself will run fine, so that is not the issue.
```{r}
require(stats); require(graphics)
plot(cars, xlab = "Speed (mph)", ylab = "Stopping distance (ft)",
     las = 1)
lines(lowess(cars$speed, cars$dist, f = 2/3, iter = 3), col = "red")
title(main = "cars data")
plot(cars, xlab = "Speed (mph)", ylab = "Stopping distance (ft)",
     las = 1, log = "xy")
title(main = "cars data (logarithmic scales)")
lines(lowess(cars$speed, cars$dist, f = 2/3, iter = 3), col = "red")
summary(fm1 <- lm(log(dist) ~ log(speed), data = cars))
opar <- par(mfrow = c(2, 2), oma = c(0, 0, 1.1, 0),
            mar = c(4.1, 4.1, 2.1, 1.1))
plot(fm1)
par(opar)
```
2. Thanks for the help. This document will now knit!
As an alternative, you could also use the full commands \begin{enumerate} and \end{enumerate}. I am not sure why your abbreviations are getting stripped from the TeX file by pandoc.
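If you want the chunk to sit inside the list item itself, so that the numbering continues after it (the situation in the tex.stackexchange question you link), the usual pandoc approach is to indent the chunk so it lines up with the item text rather than wrapping it in raw LaTeX. A minimal sketch, not tested against your full document:

````
1. Here is a sample of some code I found online.

    ```{r}
    plot(cars, xlab = "Speed (mph)", ylab = "Stopping distance (ft)")
    ```

2. Because the chunk above is indented to match the item text, pandoc keeps
   it inside item 1 and the numbering continues here.
````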