As covered in the Pipelines, Jobs, and Stages section, a build is ultimately defined by a Job description. In its simplest and most common form, a Job is described by a name identifier, followed by an image reference and one or more script lines. The image property defines the Docker container image in which the Job executes. This is often a minimal image based on a well-known operating system, offering a strict set of programs. Custom commands, where the automation and integration usually happen, are added as script lines. These command lines work just like shell commands typed into a terminal on a regular computer or server.
Let’s take a look at an example configuration file:
```yaml
version: 1

jobs:
  hello:
    image: alpine
    script:
      - echo Hello world!
```
The configuration above defines a Pipeline. This particular Pipeline is composed of a single Job named hello, which prints the Hello world! message to the Alpine Linux container console.
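To make the execution model concrete, here is a minimal sketch in Python (not part of the platform) of what a runner conceptually does with the hello Job. It runs the script lines with the local shell instead of inside an Alpine container, purely for illustration:

```python
import subprocess

# In-memory equivalent of the configuration above. A real runner would
# execute each script line inside the "alpine" container image; this
# sketch runs the same lines with the local shell purely for illustration.
pipeline = {
    "version": 1,
    "jobs": {
        "hello": {
            "image": "alpine",
            "script": ["echo Hello world!"],
        },
    },
}

outputs = []
for line in pipeline["jobs"]["hello"]["script"]:
    completed = subprocess.run(line, shell=True, capture_output=True, text=True)
    outputs.append(completed.stdout)

print("".join(outputs))  # the Job's console output
```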
Let’s analyze another example configuration file:
```yaml
version: 1

jobs:
  compile:
    image: node
    script:
      - npm install truffle
      - npm install
      - npx truffle compile
```
The configuration above defines a Pipeline intended to be executed against a valid Truffle project. It describes a single Job, named compile, based on the official Node.js Docker image. The Job executes three commands, which can be read as follows:
- install the Truffle package from npm;
- install all packages defined in the project's package.json, located in the current directory;
- invoke Truffle to compile the project, also expected to be located in the current directory.
The compile Job assumes the current working directory is set to the project's root directory. That is the default behavior: every new build starts from the root directory of the connected repository.
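The working-directory behaviour can be sketched with a short Python snippet (POSIX shell assumed; the temporary directory is purely illustrative and stands in for the checked-out repository):

```python
import os
import subprocess
import tempfile

# Each script line is executed from the repository root by default.
# Here a temporary directory stands in for the checked-out repository.
with tempfile.TemporaryDirectory() as repo_root:
    completed = subprocess.run(
        "pwd", shell=True, cwd=repo_root, capture_output=True, text=True
    )
    # Compare resolved paths so symlinked temp dirs (e.g. on macOS) match.
    ran_in_root = (
        os.path.realpath(completed.stdout.strip()) == os.path.realpath(repo_root)
    )
```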
Multiple Jobs are defined the same way as a single Job, as presented below:
```yaml
version: 1

jobs:
  compile:
    image: node
    script:
      - npm install truffle
      - npm install
      - npx truffle compile
  test:
    image: node
    script:
      - npm install truffle
      - npm install
      - npx truffle test
```
A Pipeline containing multiple Jobs is resolved asynchronously: Jobs may be executed in parallel and out of order.
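The scheduling consequence can be illustrated with a short Python sketch, where threads stand in for containers and the job durations are made up: the shorter test job finishes before compile, even though compile was submitted first.

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

# Illustration only: without Stages, Jobs may run concurrently, so
# completion order is not guaranteed. Durations are hypothetical.
def run_job(name, duration):
    time.sleep(duration)
    return name

jobs = {"compile": 0.2, "test": 0.1}

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(run_job, name, d) for name, d in jobs.items()]
    finished = [f.result() for f in as_completed(futures)]

# Although "compile" was submitted first, "test" completes first here.
```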
To guarantee that Jobs are executed in a specific order, it is necessary to add Stages, as shown in the following example:
```yaml
version: 1

jobs:
  compile:
    image: node
    script:
      - npm install truffle
      - npm install
      - npx truffle compile
  test:
    image: node
    script:
      - npm install truffle
      - npm install
      - npx truffle test

stages:
  - custom_sequence:
      jobs:
        - compile
        - test
```
The custom_sequence Stage is an arbitrary identifier that groups one or more Jobs for sequential execution, in the order defined by the list.
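A sketch of how a Stage serialises its Jobs, again with plain Python data standing in for the parsed configuration:

```python
# The ordered "jobs" list inside a Stage is what enforces sequencing:
# each Job starts only after the previous one has finished.
stages = [
    {"custom_sequence": {"jobs": ["compile", "test"]}},
]

execution_order = []
for stage in stages:
    for stage_name, body in stage.items():
        for job in body["jobs"]:  # strictly in listed order
            execution_order.append(job)  # a real runner would block here
```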
It is also possible to define multiple Stages by editing the previous example as follows:
```yaml
[...]

stages:
  - compile:
      jobs:
        - compile
  - test:
      jobs:
        - test
```
The previous example results in a Pipeline composed of two Stages, each containing a single Job. The expected execution order is compile first, then test.