Jenkins

1) CI/CD

Continuous Integration (CI)
-This is a software development practice in which members of a team integrate their work frequently, at least daily, leading to multiple integrations per day.


Base code -> checkout -> changes <-> Local Tests
changes -> base code


Continuous Delivery (CD)
-This is a software development discipline where software is built in a manner that allows it to be deployed to customers at any time.


Continuous Deployment
-This extends Continuous Delivery by automating the deployment process, so that code is automatically deployed to production after it passes automated testing.


2) Jenkins overview

- Jenkins is a self-contained, open source automation server which can be used to automate all sorts of tasks, such as building, testing, and deploying software

-Jenkins can be installed through native system packages, Docker, or even run standalone on any machine with the Java Runtime Environment (JRE) installed.

3) Installation of Jenkins

Infrastructure challenges
-install and configure Jenkins master
-install and configure  all required dependencies
-install and configure plugins
-create and connect slaves (provision agents using swarm plugin - https://wiki.jenkins.io/display/JENKINS/Swarm+Plugin )
-add JDK installation
-configure authentication
-create credentials
...
-configuration management tools
  *Ansible
  *Puppet
  *Chef
  *Salt
  *etc.
-Groovy console
-Jenkins CLI
-Slave management
  *Swarm plugin
  *Docker plugin
  *SSH slaves
- Automate agent provisioning and make them ephemeral



# RHEL/CentOS installation
sudo su # this prevents us from having to issue sudo each time
yum install -y wget
wget -O /etc/yum.repos.d/jenkins.repo https://pkg.jenkins.io/redhat/jenkins.repo
rpm --import https://pkg.jenkins.io/redhat/jenkins.io.key
yum install jenkins java-1.8.0-openjdk-devel git -y
systemctl enable jenkins && systemctl restart jenkins


# Debian/Ubuntu installation
apt-get install wget
wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
echo deb https://pkg.jenkins.io/debian-stable binary/ | sudo tee /etc/apt/sources.list.d/jenkins.list
apt-get update
apt-get install jenkins
systemctl enable jenkins && systemctl restart jenkins
ufw allow 8080 # Jenkins listens on port 8080 by default
ufw status


# the initial admin password is stored at:
cat /var/lib/jenkins/secrets/initialAdminPassword





Using docker
docker run -p 8080:8080 -p 50000:50000 \
-v $PWD/jenkins:/var/jenkins_home \
jenkins/jenkins:lts

Use the jenkins/jenkins Docker image


FROM jenkins/jenkins:lts
COPY plugins.txt /usr/share/jenkins/ref/plugins.txt
RUN /usr/local/bin/install-plugins.sh < /usr/share/jenkins/ref/plugins.txt
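The plugins.txt referenced above lists one plugin per line in pluginID:version form; a minimal sketch (the plugin IDs and versions below are illustrative examples, not a recommendation):

```
git:latest
workflow-aggregator:latest
credentials-binding:latest
```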


4) Understanding and configuring Jenkins tools

5) Creating and working with Users for Jenkins

6)  Jobs in Jenkins:

- A job is any runnable task that is controlled by Jenkins

- The Jenkins official documentation indicates that the term Jobs has been replaced by the term Project


New Item pane

-freestyle project
* this is the central feature of Jenkins. Jenkins will build your project, combining any SCM with any build system; this can even be used for something other than a software build.
*this is the most common type of project. The build step for this type of project normally executes a shell (Linux) or batch (Windows) command.

-pipeline
* orchestrates long-running activities that can span multiple build slaves. Suitable for building pipelines (formerly known as workflows) and/or organizing complex activities that do not easily fit the freestyle job type
*this type of project used to be called a workflow. These projects are normally written in a Jenkins domain-specific language (DSL).
*these types of projects are for things that do not fit in a freestyle project, because they are too complicated or span multiple nodes.

-multi-configuration project
* suitable for projects that need a large number of different configurations, such as testing on multiple environments
*for projects on multiple OSes
*this is for projects that will be tested on multiple environments and require different configurations, depending on those environments.

-folder
* creates a container that stores nested items in it. Useful for grouping things together. Unlike a view, which is just a filter, a folder creates a separate namespace, so you can have multiple things of the same name as long as they are in different folders.
-this provides a method to group projects together. This is not technically a project. It acts as a type of directory structure for the projects, and the folder name becomes part of the path of the projects.


-github organization
* scans a GitHub organization (or user account) for repositories matching some defined markers.
*this type of project can use the source control platform's organization and allow Jenkins to act on Jenkinsfiles stored within the organization's repositories.
-the default plugin is GitHub

-multibranch pipeline
*creates a set of Pipeline projects according to detected branches in one SCM repository.
*in this type of project, Jenkins uses a Jenkinsfile to mark repositories. If a branch is created in that repository, Jenkins will make a new project in Jenkins for that branch.



Job/Project Scope

-This includes all of the items that are part of that particular Job/Project. In some cases, there are Global libraries that are brought into the scope of a project simply by being included. Other items that are declared within a project only exist in that project's scope, and are not available as a shared resource.


Jobs - Freestyle project
-General -> This project is parameterized
-Build Environments -> Add timestamps to the Console Output
-Build Environments -> Delete workspace before build starts // deleting artifacts from workspaces
-Build -> Execute shell


7) Build

- A build is the result of a single execution of a project

source -> (pull) -> branch -> (change) -> commit  <-> feedback <-(integration)--> test -> (integration pass) -> BUILD

Jenkins part starts from feedback

BUILD
->check SCM for changes -> TRIGGER
->clean checkout
->build code
->perform testing
->check pass fail
->provide feedback
->if pass produce build -> TRIGGER
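The build flow above can be sketched as a declarative pipeline; this is a minimal sketch, and the build and test script names are hypothetical placeholders:

```groovy
pipeline {
  agent any
  stages {
    stage('Checkout') {
      steps { checkout scm }          // clean checkout from the configured SCM
    }
    stage('Build') {
      steps { sh './build.sh' }       // hypothetical build script
    }
    stage('Test') {
      steps { sh './run_tests.sh' }   // hypothetical test script
    }
  }
  post {
    success { echo 'Build passed' }   // feedback on pass
    failure { echo 'Build failed' }   // feedback on fail
  }
}
```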


Build trigger
-build triggers -> build after other projects are built -> trigger only if build is stable


Artifacts and repositories:

-this refers to immutable files that are generated during a build or a pipeline run. These are archived on the Jenkins master for later retrieval.
-A single build can have multiple artifacts associated with it. These can include jar files, war files, configuration files, and other generated assets.
-Artifacts are maintained in a repository. This can be on the Jenkins master or in a Source Control Manager (SCM).
-Repositories hold items that need to be retrieved. These items can include source code, compiled code artifacts, and configuration files.

Build Tools

-these are the software tools that actually perform the build portion of the pipeline.
-Build tools can include Maven, Ant, and shell scripting.
-Configuration varies by build tool, but the processes are similar:
1. Start Jenkins and install required plugins
2. Perform global configuration steps.
3. Create a Job/Pipeline that utilizes the build tool
4. Update the tool's configuration files: POM, XML, .config, etc.
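As a sketch of step 3, a pipeline can reference a Maven installation defined under Global Tool Configuration; the tool name 'M3' is an assumption and must match the name configured on your Jenkins instance:

```groovy
pipeline {
  agent any
  tools {
    maven 'M3'   // assumed name from Manage Jenkins -> Global Tool Configuration
  }
  stages {
    stage('Build') {
      steps {
        sh 'mvn -B clean package'   // -B = batch mode, suitable for CI
      }
    }
  }
}
```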



Source Control Manager (SCM)

-A Source Control Manager (SCM) is software that is used to track changes in code.
-changes in code, called revisions, are timestamped and include the identity of the person that made the change.
-Changes can be tracked or rolled back as needed. Versions of the code can be compared, stored, and merged with other versions.
-Some examples are Git, Subversion, Mercurial, and Perforce.
-Cloud-based SCMs such as GitHub or GitLab can be leveraged as offsite repositories for code.
-Jenkins Changelogs are used for tracking changes in builds.


Source Control Management
-Source Control Management -> git -> build -> execute shell -> chmod +x file.txt && ./file.txt


8) Handling Code from Source Control
a) Incremental update - repository -> pull changes into local branch - Incremental Branch
b) Clean checkout - repository -> delete local and clone - clean branch

Checking in code to Source control
-checking in code is the process of pushing changes to a repository.
-checking in code is the same as a code commit.
-as part of the CI methodology, code should be checked in often.
-all code commits should have a descriptive message that indicates what changes the commit includes.
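A minimal sketch of a check-in with a descriptive message; the repository, file name, and commit message are invented for illustration:

```shell
set -e
repo=$(mktemp -d)            # throwaway repository for the example
cd "$repo"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"
echo "retry on timeout" > client.cfg
git add client.cfg
# descriptive message: states what changed and why
git commit -q -m "client: retry requests on connection timeout"
git log --oneline -1
```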

9) Infrastructure As Code
-This is the process of managing and provisioning resources via configuration files.
-It allows machine configurations to be maintained in source control. Those configurations can then be rolled back or versioned.


Branching and Merging Strategies
-These are the methods of checking in code to source control in such a way that a source of truth is determined.

remote:  A B C D
local:   A B E D
result:  A B E D   (when local is the source of truth)

One repository is determined to be the source of truth. Conflicting changes are resolved in favor of that source.




10) Testing
-Testing is the process of checking code to ensure that it is working as designed, or that its output is what is expected


function add_5(a){
   return a + 5
}

@Test
function test_add_5(){
  result = add_5(5)
  assert.Equals(10, result, 'add_5(5) must equal 10')
}


Types of Tests

a) Unit Test
-in this type of test individual components (classes, methods, modules) are tested to ensure that outputs are as expected.

some_data = access_database.get('username', 'address', 'phone_number')

-test to ensure that the data contained in some_data is what is expected:

@test {
  Ensure phone_number is a phone number
  Ensure address is an address
  Ensure username is a username
}


b) Smoke Test

-more generalized than a unit test, this type of test checks the main functionality of the software to ensure that it is stable enough for further testing:
*Does it load?
*Does it crash when the save button is clicked?
*Do the menus work?


c) Verification / Functional Test

-Verification testing seeks to answer the question "Did we satisfy the build requirements?" Automated verification testing is used to streamline this process.
-Functional testing checks a specific function of the software. This seeks to answer "does this feature work" or "Can a user do this?"


d) Acceptance Test

-This is the handoff test of the software to the client. It is normally done by the client to ensure that the software meets their expectations.



11) Notifications

-Notifications are critical to an automated process; they give you active feedback on the status of processes within the project.
- If a build fails, or if you need to manually approve a deployment, you can configure a notification to be sent.
-Types of notifications include E-mail, SMS, and several types of instant messaging that are configurable via plugins.
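For example, with the Mailer plugin a declarative pipeline can send mail when a build fails; a sketch, where the address and body text are placeholders:

```groovy
post {
  failure {
    mail to: 'team@example.com',
         subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
         body: "See ${env.BUILD_URL} for details."
  }
}
```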

12) Distributed builds 

-Distributed builds are jobs in which the executor of the build is located on an agent (node) that is separate from the master
-the master acts as the controller for the build, running specific builds on specific agents, allowing for parallelism and greater ease in multiconfiguration pipelines.
-if you have 3 versions of the software to perform 5 unit tests against, this can be done in one parallel pass, resulting in 5 tests on each agent rather than 15 tests on the master


              Firefox
Master   ->   Chrome    ->   Repository
              Safari



-Nodes with specific configurations can be tagged so that pipeline steps specific to that configuration are directed to the node
- In most cases artifacts, progress reports, and build results are sent back to the master repository. Storage on the master must be considered for this reason.
-Master/Agent communication is via SSH (preferred) or JNLP (TCP or HTTP)
-Agents should be "fungible" (replaceable). This means that local configuration on the agent should be kept to a minimum and global configuration on the master should be preferred.
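Directing work to a tagged node is done with an agent label expression in a pipeline; a sketch, where the labels and script name are hypothetical:

```groovy
pipeline {
  agent none
  stages {
    stage('Chrome tests') {
      agent { label 'linux && chrome' }   // runs only on agents carrying both labels
      steps {
        sh './run_browser_tests.sh'       // hypothetical test script
      }
    }
  }
}
```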


13) Plugins

-Plugins are extensions to Jenkins that add to its functionality
-Jenkins defines interfaces / abstract classes that model a part of the build system. These define what needs to be  implemented, and Jenkins plugins extend that implementation


Recommended Plugins
-During installation you  have the choice to install recommended plugins or select plugins.
-The recommended plugins:
*Ant Plugin
*Credentials Binding Plugin
*LDAP Plugin
*Pipeline: Stage View Plugin
*Pipeline
*OWASP Markup Formatter Plugin
*Email Extension Plugin
*Mailer Plugin
*SSH Slaves Plugin
*GitHub Organization Folder Plugin
*Build Timeout Plugin
*Git Plugin
*Matrix Authorization Strategy Plugin
*Subversion Plugin
*Workspace Cleanup Plugin
*CloudBees Folders Plugin
*Gradle Plugin
*PAM Authentication Plugin
*Timestamper


Plugin Manager
-managing plugins
-the plugin manager can also provide important information about a plugin, including version conflicts and issues that may arise during update or installation



14) Jenkins Rest API
-Jenkins provides a machine-consumable, REST-style API for programmatically interacting with the Jenkins server.
-Documentation on this is located on the Jenkins server itself at http://serveraddress/api
-http://localhost:8080/api

API Interaction
-Jenkins exposes REST-API interaction by sending REST methods to the Jenkins URL
-for example, this will trigger a build of the named job:

curl -X POST http://jenkinsurl:8080/job/jobname/build --user username:password

-this is an example of sending JSON values for a parametrized build:

curl -X POST http://jenkinsurl:8080/job/jobname/build --user username:password --data-urlencode json='{"parameter": [{"name":"id", "value":"abcd"}]}'


Why use the API?

-The API can be used to create jobs in a programmatic way. Once the job is defined as code it can be checked into source and versioned and managed.
-Jobs can be copied, so once a pipeline has been standardized it can be reproduced without manual configuration.
-The API can be leveraged to facilitate reporting on the status of the build queue, and the load on the Jenkins servers, to determine if there are issues
-It provides administrators with a method to restart Jenkins without requiring them to have SSH access to the underlying infrastructure on which Jenkins is running



15) Security

a) Authentication vs Authorization
-Authentication is the process of verifying who you are
-Authorization is the process of determining what you are allowed to do

b) Matrix Security
-This provides the ability to configure security based on different sections or context
-this can be Global or Project based
-Inheritance is selectable from the drop down menu in the project:

*Inherit permissions from parent ACL

**The first option, inherit from parent, is for projects that are in a folder or are the child of another object.
**The description of this type is important, as it indicates where permissions can be added in the chain.

*Inherit globally defined permissions

**The second option, inherit globally defined permissions, is for projects that are in a folder or are the child of another object but do not want the permissions from the folder or parent, only global permissions.
**The description of this type is important, as it indicates where permissions can be added in the chain.

*Do not inherit permission grants from other ACLs

**The last option, do not inherit permission grants, prevents the project from inheriting any permissions from either the global settings or parent items.
**The description of this type is important, as it indicates where permissions can be added in the chain.




c) Auditing

-Auditing is the process of verifying that the access permissions are working as intended
-This ensures that the best practice of least permissions required is maintained.
-Remember that Jenkins is an explicit-allow model and that there is no deny; if something is not explicitly allowed, then it is denied.
-Also, Jenkins permissions are additive and global -> parent -> job is how those allows are stacked. If something is allowed at a level above, and inheritance is enabled, then it is allowed in the levels below.


d) Credentials
- A credential is any value that provides access to a restricted resource; this is also known as a secret. These are used by Jenkins to access restricted resources.
-Examples of credential types include username and password, SSH username and private key, secret files, secret tokens, and certificates.
-A credential provider is a location that has been configured for Jenkins to retrieve credentials from
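The Credentials Binding plugin exposes stored credentials to a build as environment variables; a sketch, where the credential ID and script name are assumptions:

```groovy
withCredentials([usernamePassword(credentialsId: 'deploy-creds',
                                  usernameVariable: 'DEPLOY_USER',
                                  passwordVariable: 'DEPLOY_PASS')]) {
  // the variables exist only inside this block and are masked in the console log
  sh './deploy.sh'   // hypothetical script reading $DEPLOY_USER / $DEPLOY_PASS
}
```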





16) Artifacts and Fingerprints

a) Artifacts
-an artifact is an immutable file that is generated during a build or pipeline run
-these are used to provide the compiled project to end users, facilitate the testing process, create releases, and prevent rebuilding of known good code.
-Artifacts of compiled code are also used as a way to version the software
-Artifacts are stored in a repository; fingerprinting is used to determine which build produced that artifact
-On the Jenkins master the default location of the archive repository is:
$JENKINS_HOME/jobs/<jobname>/builds/lastSuccessfulBuild/archive
-Retention policies can be configured to prevent bloating of the repositories
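In a declarative pipeline a retention policy can be declared with the buildDiscarder option; the numbers and build step here are illustrative:

```groovy
pipeline {
  agent any
  options {
    // keep only the last 10 builds, and artifacts for only the last 5
    buildDiscarder(logRotator(numToKeepStr: '10', artifactNumToKeepStr: '5'))
  }
  stages {
    stage('Build') { steps { sh 'make' } }   // placeholder build step
  }
}
```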

b) Fingerprints
-a globally unique hash that is used to track artifacts or other entities across multiple pipelines or projects
-stored in the fingerprints directory inside the Jenkins home directory
-in the fingerprints directory the files are stored in a hierarchy that is based on the first characters of the checksum, e.g.:
/var/lib/jenkins/fingerprints/98/b8
-Post-build Actions - fingerprinting must be enabled on the project configuration screen
-Specify which artifacts to archive and which artifacts to fingerprint
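The pipeline equivalent of those Post-build Actions is the archiveArtifacts step with fingerprinting enabled; a sketch, where the artifact path is an assumption:

```groovy
post {
  success {
    archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
  }
}
```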
-the contents of the fingerprint file for this build is shown below:

<?xml version='1.1' encoding='UTF-8'?>
<fingerprint>
  <timestamp>2018-09-19 19:20:02.644 UTC</timestamp>   <- last build time
  <original>
     <name>TestProject</name>                          <- name of the project that produced the file
     <number>4</number>                                <- build number
  </original>
  <md5sum>98b83a060946bed8952ff73e263a78be</md5sum>    <- MD5 hash of the file
  <fileName>jout.txt</fileName>
  <usages>
     <entry>
       <string>TestProject</string>                    <- references: other places this resource has been used
       <ranges>4</ranges>
     </entry>
  </usages>
  <facets/>
</fingerprint>






17) Pipelines

a) Pipeline Concepts
b) Upstream, Downstream and Triggers
c) Parameters

pipeline {
  agent any
  parameters {
    string(description: 'foo', name: 'bar')
  }
  environment {
    BAZ = "${params.bar}"
  }
  stages {
    stage('Print') {
      steps {
        sh 'echo $BAZ'
      }
    }
  }
}



d) Promotions
e) Pipeline (the artist formerly known as "Workflow")
f) Pipeline Multibranch and Repository Scanning
g) Pipeline Global Libraries
h) Parameterizing builds
i) Build Triggers
j) parallel

parallel 'end-to-end-tests': {
  // E2E
  node('e2e-node') {...}
}, 'performance-tests': {
  // Perf tests
  node('perf-test-node') {...}
}


def splits = splitTests count(2)
def branches = [:]
for (int i = 0; i < splits.size(); i++) {
  def index = i
  branches["split${i}"] = {
    def exclusions = splits.get(index);
    // mvn test excludes exclusions
  }
}
parallel branches


stash name: 'sources', includes: 'pom.xml,src/'
def splits = splitTests count(2)
def branches = [:]
for (int i = 0; i < splits.size(); i++) {
  def index = i
  branches["split${i}"] = {
    node('remote') {
      unstash 'sources'
      def exclusions = splits.get(index);
      writeFile file: 'exclusions.txt', text: exclusions.join("\n")
      sh "${tool 'M3'}/bin/mvn -B -Dmaven.test.failure.ignore test"
      junit 'target/surefire-reports/*.xml'
    }
  }
}
parallel branches

k) declarative vs scripted

declarative
-easy to read
-concise
-validation before running
-visual editor

scripted
-a bit harder due to more Groovy
-boilerplate
-try-commit-retry-commit loop
-Groovy editor

18) Jenkinsfile


// pipeline 1
pipeline {
  agent any // run all steps on any available agent
  stages {
    stage('Hello World') {
      steps {
        sh 'echo Hello World'
      }
    }
  }
}

// pipeline 2
pipeline {
  agent none // no global agent; each stage declares its own
  stages {
    stage('Do work') {
      agent any
      steps {...}
    }
    stage('Input requested') {
      agent none
      steps {
        input message: 'user input'
      }
    }
  }
}


environment variables:
-https://hudson.eclipse.org/webtools/env-vars.html/
- do not override $env
- use:
withEnv(["HELLO=world"]){
  sh 'echo $HELLO'
}

pipeline {
  agent any
  environment {
    HELLO = 'world'
  }
  stages {
   ...
  }
}


19) Groovy script

18) CD as Code

a) Distributed Builds Architecture
b) Replaceable Agents
c) Master Agents
d) Master Agent Connectors and Protocol
e) Tool Installation on Agents
f) Cloud Agents
g) High Availability

19) The Exam

a) What to expect
b) Where to go from here

c) Practice exam

20) Working with Maven project of JAVA to build + Compile

21) Working with Maven project of JAVA to build + Compile + Test

22) Working with Maven project of JAVA to build + Compile + Test + Report  Generation

23) Working with Pipeline Project

24) Working with Maven project of JAVA to build + Compile + Test + Report  Generation with pipeline

25) Working with .NET project using Freestyle Jenkins

26) Working with .NET project using Pipeline Jenkins

27) Working with Agents/Nodes

28) Cross browser testing

29) Parallel testing

30) Working with Docker

31) Job DSL

a) Domain Specific Language
-to specify job configuration
-Groovy-based DSL
- job/view/dashboard configuration
-developed as "normal" code in an IDE with
  * auto-completion
  * type checking
  * Groovy magic if needed
-outside the Jenkins instance
- DSL documentation - https://github.com/jenkinsci/job-dsl-plugin/wiki/Job-DSL-Commands , https://github.com/jenkinsci/job-dsl-plugin/wiki/Job-referance
- comprehensive support for Jenkins core features
-extensive support for additional plugins
  *over 180 plugins as of version 1.44
  *active community - continuous flow of new pull requests
-powerful configuration block for
  * not yet supported features
  * custom stuff
- virtually everything possible in XML should be achievable
- Jenkins Chef DSL cookbook https://github.com/erichelgeson/jenkins-chef-dsl



DSL example

job('gr8 example') {
  scm {
    github 'sheehan/job-dsl-gradle-example'
  }
  triggers {
    scm 'H/5 * * * *'
  }
  steps {
    gradle 'clean test'
  }
  publishers {
    archiveJunit 'build/test-results/**/*.xml'
    extendedEmail 'sth@gmail.com'
  }
}

Dynamic job DSL example

import groovy.json.JsonSlurper

String repo = 'gr8day/mobile-app'

URL branchUrl = "https://api.github.com/repos/$repo/branches".toURL()
List branches = new JsonSlurper().parseText(branchUrl.text)

branches.each { branch ->

  String safeBranchName = branch.name.replaceAll('/', '-')

  job("$repo-$safeBranchName-build") {
    scm {
      github repo, branch.name
    }
    triggers {
      scm 'H/5 * * * *'
    }
    steps {
      gradle 'clean test'
    }
  }
}

job DSL - benefits
-source code instead of XML or GUI
-single source of truth
-manageable jobs and views
  *backed by SCM
  *reviewable - possibly with pull requests
-testable
  * automatic "unit" testing
  * pre-production environment
-scalable
  * hundreds of jobs created/modified in seconds

job DSL - drawbacks/limitations
-quite steep learning curve
-can become hard to understand for complex configurations
-small error in DSL can remove some/all jobs
  * jobs can be easily recreated, but without execution history
-not suitable for global Jenkins configuration management
  * credentials, machine provisioning, Jenkins and plugin updates, ...



Job DSL plugin
1. Developer updates DSL script locally
2. Developer pushes changes
3. SCM change triggers seed job
4. Seed job runs DSL
5. Seed job updates/creates/deletes

Create seed job
- freestyle project
-Build -> Process Job DSls -> Use the provided DSL script

100.times {
  job('example' + it) {}
}

Job DSL Gradle example - https://github.com/sheehan/job-dsl-gradle-example

b) Jenkins plugin
-to transform configuration DSL into real jobs in Jenkins
-installed on the Jenkins instance
-used in seed jobs on Jenkins
-leverages DSL configuration
-updates jobs & views in Jenkins
  * to bring them to desired state
  * XML configuration files modification

Provision your plugins

docker exec -it [containerId] /usr/local/bin/install-plugins.sh [plugins]

and restart Jenkins:

http://[jenkinsurl]/safeRestart



32) Continuous Delivery

- Clearly defined way to transform source code into a product deployed to production
  * a set of steps arranged into a pipeline
  * unified way for various projects/variants/realms
-bunch of jobs triggering each other
-can be emulated with various plugins
  * Delivery Pipeline Plugin, Build Flow Plugin, Pipeline, ...
-no easy (and unified) way to set up
- usually even harder to maintain

Continuous Delivery - case study
- custom Continuous Delivery framework
  * on top of Jenkins Job DSL
- one standardized way for Continuous Delivery
- reused in all projects in the company
- Ansible for infrastructure management
- Rundeck for deployment

33) Maven 
- do not use the Maven job type; you can use a Freestyle project instead
