![Drone Processing Pipeline logo](https://github.com/az-digitalag/Drone-Processing-Pipeline/raw/07b1edc34a1faea501c80f583beb07f9d6b290bb/resources/drone-pipeline.png =100x100)

# Drone Pipeline #

A processing pipeline for drone-captured plant phenomic information. This document is intended to provide high-level context on the drone pipeline and information on how to use it.

## Overview ##
--------------

The goals motivating the drone pipeline are (in no particular order):

- common processing: provide components that are reusable in multiple environments
- dynamic workflows: mix common and unique processing components to create meaningful processing pipelines
- scalable workflows: use scalable architecture as needed, in the right places, to return results faster

The drone pipeline effort is part of the larger TERRA REF project and the source code resides on the [TERRA REF](https://github.com/terraref/drone-pipeline "TerraRef GitHub repository link") GitHub site. Additional code and information can be found in the [Drone Pipeline](https://github.com/az-digitalag/Drone-Processing-Pipeline "Drone Pipeline GitHub repository link") portion of our [Digital Agriculture](https://github.com/az-digitalag "Digital Agriculture GitHub organization") GitHub site.

Below is an overview diagram of the data processing pipelines.

![Drone Processing Pipeline Overview][1]

## Processing Steps ##
----------------------

At its simplest level, the drone pipeline has the following steps:

1. Make captured and configuration data available
2. Trigger the start of processing
3. Process data until the workflow is complete
4. Make the result of the workflow available

Of course, there are many details the above steps gloss over, such as: how does one make data available, and what does it mean to make it available?

In our approach we use the [Scientific Filesystem](https://sci-f.github.io/), packaged with Docker, to bundle the workflows and other functionality. This approach requires the data being processed and produced to be local to the running Docker container. This can be done by using Docker volumes to make data available; a minimal example of mounting data this way appears after the Future Work section below. See the [Docker volumes](https://docs.docker.com/storage/volumes/) page for more information.

## Transformers ##
------------------

Transformers are worker processes that perform specific units of work in our workflows. There are many alternative names for transformers, such as "nodes". We use the term transformers to indicate their specific [conceptual basis](https://agpipeline.github.io/transformers/transformers) and that they are used in our workflows.

All workflows consist of one or more transformers. Transformers work in sequence, and possibly together, to complete a workflow. Most transformers have different input requirements that need to be satisfied before they can be run successfully.

## Workflows ##
---------------

More information on [workflows](https://osf.io/xdkcy/wiki/Workflows/) can be found on this site.

A workflow defines the order of transformer events. Workflows can have implicit or explicit components. An implicit workflow component is one or more transformers that are automatically triggered by the right conditions, an uploaded file for example. An explicit workflow component is one that is configured to start after a separate component instance has completed its work.

## Future Work ##
-----------------

There is a [roadmap on ZenHub](https://app.zenhub.com/workspaces/drone-processing-pipeline-5e7e97f39771620e1b5a8893/roadmap) for this pipeline's planned development.
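As noted in the Processing Steps section, captured and configuration data must be local to the running Docker container, which is typically arranged with Docker volumes. The following is a minimal sketch of that idea rather than a command taken from the pipeline itself: the image name `agpipeline/example-transformer`, the host path, and the container path `/mnt/data` are placeholder assumptions; substitute the transformer image and paths you actually use.

```python
"""Minimal sketch: exposing local drone data to a transformer container.

Assumptions (not from the pipeline docs): the image name
"agpipeline/example-transformer" and the container path "/mnt/data"
are placeholders for whichever transformer image and paths you use.
"""
import subprocess
from pathlib import Path

# Host directory holding the captured imagery and configuration files.
host_data = Path("~/drone-captures/flight-001").expanduser()

# Bind-mount the host directory into the container (docker's -v flag) so the
# transformer can read its inputs and write results back to the same location.
subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", f"{host_data}:/mnt/data",
        "agpipeline/example-transformer",  # hypothetical image name
    ],
    check=True,
)
```

The same mount can be made directly on the command line with `docker run -v <host path>:<container path> <image>`; the point is simply that whatever data a transformer reads or writes must live under a mounted path.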
## Acknowledgements ##
----------------------

Many people have put a lot of work into making this project happen. Our [Acknowledgements](https://osf.io/xdkcy/wiki/Acknowledgements/) page lists people and projects that have made contributions. If you feel that you have been left out of our Acknowledgements, or if you want to be removed, [let us know](mailto:schnaufer@email.arizona.edu)!

[1]: https://mfr.osf.io/export?url=https://osf.io/w9n74/?action=download&mode=render&direct&public_file=True&initialWidth=684&childId=mfrIframe&parentTitle=OSF%20%7C%20CIMMYT%20Drone%20Pipeline.png&parentUrl=https://osf.io/w9n74/&format=2400x2400.jpeg