Dynamic branching over exported files
This has similar functionality to tarchetypes::tar_files(), except that the upstream target does not automatically become invalidated each time the pipeline is run.
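As a rough mental model (an assumption about behavior, not the package's literal internals), a call like tar_export(example_export, jfmisc::exportCSV(iris, "iris.csv")) behaves much like the pair of ordinary targets sketched below, except that the upstream files target is not forced to rerun on every tar_make(). The sketch assumes jfmisc::exportCSV() writes the file and returns its path, as the Examples section suggests.

library(targets)
list(
  # Upstream target: run the export command and return the file path(s).
  tar_target(
    example_export_files,
    jfmisc::exportCSV(iris, "iris.csv")
  ),
  # Downstream target: branch over the returned paths and track each
  # file for changes with format = "file".
  tar_target(
    example_export,
    example_export_files,
    pattern = map(example_export_files),
    format = "file"
  )
)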
Usage
tar_export_raw(
  name,
  command,
  tidy_eval = targets::tar_option_get("tidy_eval"),
  packages = targets::tar_option_get("packages"),
  library = targets::tar_option_get("library"),
  format = c("file", "url", "aws_file"),
  repository = targets::tar_option_get("repository"),
  iteration = targets::tar_option_get("iteration"),
  error = targets::tar_option_get("error"),
  memory = targets::tar_option_get("memory"),
  garbage_collection = targets::tar_option_get("garbage_collection"),
  deployment = targets::tar_option_get("deployment"),
  priority = targets::tar_option_get("priority"),
  resources = targets::tar_option_get("resources"),
  storage = targets::tar_option_get("storage"),
  retrieval = targets::tar_option_get("retrieval"),
  cue = targets::tar_option_get("cue")
)
tar_export(
  name,
  command,
  tidy_eval = targets::tar_option_get("tidy_eval"),
  packages = targets::tar_option_get("packages"),
  library = targets::tar_option_get("library"),
  format = c("file", "url", "aws_file"),
  repository = targets::tar_option_get("repository"),
  iteration = targets::tar_option_get("iteration"),
  error = targets::tar_option_get("error"),
  memory = targets::tar_option_get("memory"),
  garbage_collection = targets::tar_option_get("garbage_collection"),
  deployment = targets::tar_option_get("deployment"),
  priority = targets::tar_option_get("priority"),
  resources = targets::tar_option_get("resources"),
  storage = targets::tar_option_get("storage"),
  retrieval = targets::tar_option_get("retrieval"),
  cue = targets::tar_option_get("cue")
)
Arguments
- name
 Symbol, name of the target. A target name must be a valid name for a symbol in R, and it must not start with a dot. Subsequent targets can refer to this name symbolically to induce a dependency relationship: e.g.
tar_target(downstream_target, f(upstream_target)) is a target named downstream_target which depends on a target upstream_target and a function f(). In addition, a target's name determines its random number generator seed. In this way, each target runs with a reproducible seed so someone else running the same pipeline should get the same results, and no two targets in the same pipeline share the same seed. (Even dynamic branches have different names and thus different seeds.) You can recover the seed of a completed target with tar_meta(your_target, seed) and run set.seed() on the result to locally recreate the target's initial RNG state (see the seed sketch after this argument list).
- command
 R code to run the target.
- tidy_eval
 Logical, whether to enable tidy evaluation when interpreting
command and pattern. If TRUE, you can use the "bang-bang" operator !! to programmatically insert the values of global objects (see the tidy evaluation sketch after this argument list).
- packages
 Character vector of packages to load right before the target builds or the output data is reloaded for downstream targets. Use
tar_option_set() to set packages globally for all subsequent targets you define.
- library
 Character vector of library paths to try when loading
packages.
- format
 Character of length 1. Must be
"file","url", or"aws_file". See theformatargument oftargets::tar_target()for details.- repository
 Character of length 1, remote repository for target storage. Choices:
"local": file system of the local machine."aws": Amazon Web Services (AWS) S3 bucket. Can be configured with a non-AWS S3 bucket using theendpointargument oftar_resources_aws(), but versioning capabilities may be lost in doing so. See the cloud storage section of https://books.ropensci.org/targets/data.html for details for instructions."gcp": Google Cloud Platform storage bucket. See the cloud storage section of https://books.ropensci.org/targets/data.html for details for instructions.
Note: if
repository is not "local" and format is "file", then the target should create a single output file. That output file is uploaded to the cloud and tracked for changes where it exists in the cloud. The local file is deleted after the target runs.
- iteration
 Character of length 1, name of the iteration mode of the target. Choices:
"vector": branching happens withvctrs::vec_slice()and aggregation happens withvctrs::vec_c()."list", branching happens with[[]]and aggregation happens withlist()."group":dplyr::group_by()-like functionality to branch over subsets of a data frame. The target's return value must be a data frame with a specialtar_groupcolumn of consecutive integers from 1 through the number of groups. Each integer designates a group, and a branch is created for each collection of rows in a group. See thetar_group()function to see how you can create the specialtar_groupcolumn withdplyr::group_by().
- error
 Character of length 1, what to do if the target stops and throws an error. Options:
"stop": the whole pipeline stops and throws an error."continue": the whole pipeline keeps going."abridge": any currently running targets keep running, but no new targets launch after that. (Visit https://books.ropensci.org/targets/debugging.html to learn how to debug targets using saved workspaces.)"null": The errored target continues and returnsNULL. The data hash is deliberately wrong so the target is not up to date for the next run of the pipeline.
- memory
 Character of length 1, memory strategy. If
"persistent", the target stays in memory until the end of the pipeline (unlessstorageis"worker", in which casetargetsunloads the value from memory right after storing it in order to avoid sending copious data over a network). If"transient", the target gets unloaded after every new target completes. Either way, the target gets automatically loaded into memory whenever another target needs the value. For cloud-based dynamic files (e.g.format = "file"withrepository = "aws"), this memory strategy applies to the temporary local copy of the file:"persistent"means it remains until the end of the pipeline and is then deleted, and"transient"means it gets deleted as soon as possible. The former conserves bandwidth, and the latter conserves local storage.- garbage_collection
 Logical, whether to run
base::gc() just before the target runs.
- deployment
 Character of length 1, only relevant to
tar_make_clustermq() and tar_make_future(). If "worker", the target builds on a parallel worker. If "main", the target builds on the host machine / process managing the pipeline.
- priority
 Numeric of length 1 between 0 and 1. Controls which targets get deployed first when multiple competing targets are ready simultaneously. Targets with priorities closer to 1 get built earlier (and polled earlier in
tar_make_future()).
- resources
 Object returned by
tar_resources() with optional settings for high-performance computing functionality, alternative data storage formats, and other optional capabilities of targets. See tar_resources() for details.
- storage
 Character of length 1, only relevant to
tar_make_clustermq() and tar_make_future(). Must be one of the following values:
"main": the target's return value is sent back to the host machine and saved/uploaded locally.
"worker": the worker saves/uploads the value.
"none": almost never recommended. It is only for niche situations, e.g. the data needs to be loaded explicitly from another language. If you do use it, then the return value of the target is totally ignored when the target ends, but each downstream target still attempts to load the data file (except when retrieval = "none").
If you select storage = "none", then the return value of the target's command is ignored, and the data is not saved automatically. As with dynamic files (format = "file"), it is the responsibility of the user to write to the data store from inside the target. The distinguishing feature of storage = "none" (as opposed to format = "file") is that in the general case, downstream targets will automatically try to load the data from the data store as a dependency. As a corollary, storage = "none" is completely unnecessary if format is "file".
- retrieval
 Character of length 1, only relevant to
tar_make_clustermq() and tar_make_future(). Must be one of the following values:
"main": the target's dependencies are loaded on the host machine and sent to the worker before the target builds.
"worker": the worker loads the target's dependencies.
"none": the dependencies are not loaded at all. This choice is almost never recommended. It is only for niche situations, e.g. the data needs to be loaded explicitly from another language.
- cue
 An optional object from
tar_cue() to customize the rules that decide whether the target is up to date. Only applies to the downstream target. The upstream target always runs.
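As mentioned under name, the seed of a completed target can be recovered and replayed locally. A minimal sketch, assuming the pipeline has already built a target named example_export:

library(targets)
# Look up the stored RNG seed of the completed target ...
seed <- tar_meta(example_export, seed)$seed
# ... and recreate the target's initial RNG state in the current session.
set.seed(seed)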
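The tidy_eval argument enables the !! operator inside command. A small sketch, reusing the jfmisc::exportCSV() helper from the Examples; the global object outfile is hypothetical:

library(hrmtargets)
outfile <- "iris.csv"
list(
  tar_export(
    example_export,
    # With tidy_eval = TRUE (the default unless changed via tar_option_set()),
    # !!outfile is replaced by "iris.csv" when the target is defined.
    jfmisc::exportCSV(iris, !!outfile)
  )
)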
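The repository and resources arguments work together for cloud storage. A hedged sketch with placeholder settings; the bucket and prefix values are hypothetical:

library(hrmtargets)
library(targets)
list(
  tar_export(
    example_export,
    jfmisc::exportCSV(iris, "iris.csv"),
    # Upload the exported file to S3 and track it there.
    repository = "aws",
    resources = tar_resources(
      aws = tar_resources_aws(bucket = "my-bucket", prefix = "exports")
    )
  )
)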
Examples
## File export
targets::tar_dir({
    targets::tar_script({
        library(hrmtargets)
    
        list(
            tar_export(
                example_export,
                jfmisc::exportCSV(iris, "iris.csv")
            )
        )
    })
    targets::tar_make()
    targets::tar_read(example_export)
})
#> • start target example_export_files
#> • built target example_export_files [2.596 seconds]
#> • start branch example_export_cc991457
#> • built branch example_export_cc991457 [0 seconds]
#> • built pattern example_export
#> • end pipeline [2.669 seconds]
#> 
#> example_export_cc991457 
#>              "iris.csv"