Commit a6db3391 authored by B.Anderson

amended to enable a way to force report update; pdf still fails, unclear why; render is still ignoring floating ToC. Have seen this before, can't remember the cause/fix
parent c4ff5901
Merge requests: !3 merge a few edits, !2 fixed pdf build, !1 Re run ellis full data
@@ -9,7 +9,7 @@ author: '`r params$authors`'
 date: 'Last run at: `r Sys.time()`'
 output:
   bookdown::html_document2:
-    self_contained: TRUE
+    self_contained: no
     fig_caption: yes
     code_folding: hide
     number_sections: yes
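One common cause of a render "ignoring" a floating ToC (as the commit message complains) is that rmarkdown only honours `toc_float` when `toc` is also enabled, and only for HTML output; PDF output drops it entirely. A hedged sketch of the header with both options set — the `toc`/`toc_float` lines are an assumption, they do not appear in this diff:

```yaml
output:
  bookdown::html_document2:
    self_contained: no
    fig_caption: yes
    toc: yes        # required: toc_float is silently ignored without toc
    toc_float: yes  # HTML-only; the pdf format ignores this option
    code_folding: hide
    number_sections: yes
```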
@@ -305,13 +305,17 @@ aggDT[, propExpected := sumOK/(uniqueN(feederDT$feeder_ID)*24*4)] # we expect 25
 summary(aggDT)
 message("How many days have 100%?")
-nrow(aggDT[propExpected == 1])
+n <- nrow(aggDT[propExpected == 1])
+n
 ```
+So, there are `r n` days with 100% data...
 If we plot the mean then we will see which days get closest to having a full dataset.
 ```{r bestDaysMean, fig.width=8}
 ggplot2::ggplot(aggDT, aes(x = rDate, colour = season, y = meanOK)) + geom_point()
 ```
 Re-plot by the % of expected if we assume we _should_ have 25 feeders * 24 hours * 4 per hour (will be the same shape):
@@ -319,6 +323,27 @@ Re-plot by the % of expected if we assume we _should_ have 25 feeders * 24 hours
 ```{r bestDaysProp, fig.width=8}
 ggplot2::ggplot(aggDT, aes(x = rDate, colour = season, y = 100*propExpected)) + geom_point() +
   labs(y = "%")
+aggDT[, rDoW := lubridate::wday(rDate, lab = TRUE)]
+h <- head(aggDT[season == "Spring"][order(-propExpected)])
+kableExtra::kable(h, caption = "Best Spring days overall",
+                  digits = 3) %>%
+  kable_styling()
+h <- head(aggDT[season == "Summer"][order(-propExpected)])
+kableExtra::kable(h, caption = "Best Summer days overall",
+                  digits = 3) %>%
+  kable_styling()
+h <- head(aggDT[season == "Autumn"][order(-propExpected)])
+kableExtra::kable(h, caption = "Best Autumn days overall",
+                  digits = 3) %>%
+  kable_styling()
+h <- head(aggDT[season == "Winter"][order(-propExpected)])
+kableExtra::kable(h, caption = "Best Winter days overall",
+                  digits = 3) %>%
+  kable_styling()
 ```
 This also tells us that there is some reason why we get fluctuations in the number of data points per hour after 2003.
...
-This is pdfTeX, Version 3.1415926-2.5-1.40.14 (TeX Live 2013) (format=pdflatex 2020.4.15) 8 JUL 2020 22:59
+This is pdfTeX, Version 3.1415926-2.5-1.40.14 (TeX Live 2013) (format=pdflatex 2020.4.15) 9 JUL 2020 00:11
 entering extended mode
 restricted \write18 enabled.
 %&-line parsing enabled.
...
@@ -54,7 +54,7 @@ addSeason <- function(dt,dateVar,h){
 }
-getData <- function(f,update){
+getData <- function(f,updateData){
   # gets the data
   dt <- data.table::fread(f)
   dt[, rDateTime := lubridate::as_datetime(Time)] # the dateTime is now called Time!!!
@@ -120,7 +120,7 @@ saveData <- function(dt, which){
   }
 }
-makeReport <- function(f,version, type = "html"){
+makeReport <- function(f,version, type = "html", updateReport){
   # default = html
   message("Rendering ", f, ".Rmd (version: ", version, ") to ", type)
   if(type == "html"){
@@ -149,14 +149,14 @@ makeReport <- function(f,version, type = "html"){
 # Set the drake plan ----
 my_plan <- drake::drake_plan(
-  origData = getData(dFile, update), # returns data as data.table. If you edit 'update' in any way it will reload - drake is watching you!
+  origData = getData(dFile, updateData), # returns data as data.table. If you edit 'update' in any way it will reload - drake is watching you!
   uniqData = makeUniq(origData), # remove duplicates
   wideData = toWide(uniqData),
   saveLong = saveData(uniqData, "L"), # doesn't actually return anything
   saveWide = saveData(wideData, "W"), # doesn't actually return anything
   # pdf output fails
-  #pdfOut = makeReport(rmdFile, version, "pdf"), # pdf - must be some way to do this without re-running the whole thing
+  pdfOut = makeReport(rmdFile, version, "pdf", updateReport), # pdf - must be some way to do this without re-running the whole thing
-  htmlOut = makeReport(rmdFile, version, "html") # html output
+  htmlOut = makeReport(rmdFile, version, "html", updateReport) # html output
 )
 # see https://books.ropensci.org/drake/projects.html#usage
...
Source diff could not be displayed: it is too large.
@@ -3,7 +3,9 @@
 # Set up ----
 startTime <- proc.time()
-update <- "yes" # edit this in any way (at all) to get drake to re-load the data
+updateData <- "yes" # edit this in any way (at all) to get drake to re-load the data
+updateReport <- "yes" # edit this to force re-render of .Rmd
 library(drake)
 # use r_make to run the plan inside a clean R session so nothing gets contaminated
 drake::r_make(source = "_drakeCleanFeeders.R") # where we keep the drake plan etc
...
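The dummy trigger variables above (`updateData`, `updateReport`) work because drake hashes each target's dependencies and re-runs a target when any hash changes; editing the string in any way changes its hash and invalidates every target that consumes it. A minimal Python sketch of that mechanism (the `needs_rerun` helper is hypothetical, not part of drake):

```python
import hashlib

def needs_rerun(trigger_value, last_hash):
    """Return (rerun, new_hash): rerun is True when trigger_value's hash
    differs from the hash stored after the previous build."""
    h = hashlib.sha256(trigger_value.encode()).hexdigest()
    return h != last_hash, h

rerun1, h1 = needs_rerun("yes", None)   # first run: nothing cached, so rebuild
rerun2, _ = needs_rerun("yes", h1)      # unchanged trigger: target stays valid
rerun3, _ = needs_rerun("yes!", h1)     # any edit at all invalidates the target
```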