Salesforce Jenkins


Why Continuous Integration?

Before we get to the guide on how to set up Jenkins on AWS, we need to answer why we even need a continuous integration server for Salesforce development. Why not keep using changesets to deploy everything? The simple answer is automation: the bigger your Salesforce development team, the more difficult it becomes to deploy changesets between sandboxes, and it is even harder when two developers are working on the same components. A continuous integration server simply picks up the changes from each developer, deploys them to the shared sandboxes and runs all tests. When a developer wants the changes made by other developers, they just pull the recent commits from the code repository.

The best development practice is to make sure every developer uses their own development sandbox. We also use two shared environments: one called Staging, which we use to run all tests and to do QA, and a full sandbox we call PreProd, where the customer performs UAT. Since Salesforce now allows 25 developer sandboxes on Enterprise Edition, there is no reason for more than one developer to work in the same sandbox.

The process works by having a shared code repository. Once a developer is done working on a feature, they commit the changes to the shared code repository. Jenkins picks this up, deploys the code to the Staging environment and runs all tests. If a test fails, Jenkins marks the deployment build as failed and the developer has to fix the issue. Ideally, developers should run all tests in their own sandbox and not push their code while tests are failing, but this doesn't always happen. If the build is successful, the code is also deployed to PreProd. You can choose when and how you'd like to deploy to production. Developers can always pull the recent changes from the code repository and deploy them to their sandbox; this is how developers should pick up changes that others have made.

Setting Up an AWS Server

We chose to host our Jenkins on AWS because, at the size we need, AWS can be free, and because of the flexibility and scaling options AWS offers.

 1. The first step is to create your AWS account and login – https://aws.amazon.com

2. Select your region where you want AWS to host your server. Just choose the closest location to you, it doesn’t really matter for a CI server.

3. From the main AWS page select EC2

4. Then go to Instances section and click “Launch Instance”

5. Select Ubuntu image

6. Select the instance type. We recommend the micro instance since it's free and you probably won't need a bigger one; if you do, it's possible to change it later on.

7. Then Add storage

8. Choose whether you want additional storage for your server. We went for 20GB, but that is mainly useful if you have a big repository; if you don't, it's not needed.

9. Give your instance a name. “Jenkins” would make sense but feel free to be creative.

Create a new security group. Add Inbound firewall rules.

 

Allow only ports 22 and 80. You can limit access to your server to certain IPs. We allowed Jenkins to be accessed from anywhere, since you need a username and password to access Jenkins anyway.

It’s time to launch your server:

Give a name to the key pair and click Download Key Pair. This is important; you need the key file to be able to access your server.

 Then Save Key Pair

 And Finally, Launch Instance.

Your instance is now set up and it has a public DNS address. Keep it as you’ll need it later. 

That's it! Our AWS server is up and running.

Installing Jenkins on the AWS Server

In order to install Jenkins on the AWS server, we'll need to access it. We'll use the .pem file we downloaded earlier.

If you're using Linux, you can access the server by running this command:
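The exact command isn't reproduced in this copy; assuming the default Ubuntu AMI user and the key you downloaded, it is typically something like:

chmod 400 your-key.pem
ssh -i your-key.pem ubuntu@yourAwsDnsAddress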

Then skip ahead to the Jenkins installation commands below.

If you're using Windows, you'll need PuTTY to connect to the AWS server.

1. On Windows, we first need to convert the .pem file to a .ppk file.

To do the conversion we will use PuTTYgen.

You can download PuttyGen from https://the.earth.li/~sgtatham/putty/latest/x86/puttygen.exe

2. Open PuTTYgen and load the .pem key file.

3. Browse for your .pem file and hit Open.

4. Then click Save private key.

Save it without a passphrase.

Now that we have a .ppk file, we need PuTTY to access our server.

You can download putty from https://the.earth.li/~sgtatham/putty/latest/x86/putty.exe

5. Open PuTTY and in Host Name put ubuntu@yourAwsDnsAddress (in the original screenshot the instance IP was used, but it's easier to use your DNS address).

This is the same public DNS address noted earlier when you launched the instance.

6. Go to the Auth section and browse for the .ppk file.

7. Now go to the Session section, give the session a name and save it. This will let you connect to the server more easily next time.

8. Now click "Open" to access your AWS server.

9. A terminal window should open.

Run the following commands to install Tomcat.
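The exact commands aren't reproduced in this copy; on Ubuntu with apt, a minimal sketch (package names vary by Ubuntu release) is:

sudo apt-get update
sudo apt-get install -y default-jdk tomcat8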

 Get latest Jenkins.war to install Jenkins
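For example (download URL assumed, not taken from the original article):

wget https://get.jenkins.io/war-stable/latest/jenkins.war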

Move jenkins.war to the Tomcat ROOT web application and restart Tomcat to start Jenkins:
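A sketch assuming the Ubuntu tomcat8 package layout; deploying the WAR as ROOT.war serves Jenkins at the root context:

sudo systemctl stop tomcat8
sudo rm -rf /var/lib/tomcat8/webapps/ROOT
sudo mv jenkins.war /var/lib/tomcat8/webapps/ROOT.war
sudo systemctl start tomcat8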

Tomcat is now running Jenkins on port 8080. Next we'll install nginx to redirect port 80 to it:
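For example:

sudo apt-get install -y nginx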

We'll also need to edit the nginx configuration file. Use this command to open it for editing:
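The original command isn't shown; on Ubuntu the default site configuration usually lives at the path below (assumption):

sudo vi /etc/nginx/sites-available/default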

Press "i" to start editing the content. You can delete all the existing content and paste the following text. To save, press Esc, type ":wq!" and hit Enter.
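The original configuration text is missing from this copy; a minimal reverse-proxy configuration that forwards port 80 to Jenkins on port 8080 looks roughly like this (server_name is a placeholder):

server {
    listen 80;
    server_name yourAwsDnsAddress;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}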

Now to access your Jenkins use  http://yourAwsDnsAddress

Note 1:

You can use your own URL like http://jenkins.myCompany.com so it will be easy to access your server. You do that by adding a DNS record to your server.

Here are details on how to do that: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/dynamic-dns.html

Note 2:

Our Jenkins is running without SSL, which means the communication with it is not encrypted. You can create your own SSL certificate (or use one you already have) to make your Jenkins more secure, and configure your nginx server to use it.

Here are further details on how to do that:

https://www.digitalocean.com/community/tutorials/how-to-create-an-ssl-certificate-on-nginx-for-ubuntu 

When you access that URL you should see the "Unlock Jenkins" page.

To get the initial Jenkins admin password, use this command:
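The exact path depends on where JENKINS_HOME points; when Jenkins runs inside the Ubuntu tomcat8 package it is usually under the Tomcat user's home, for example:

sudo cat /usr/share/tomcat8/.jenkins/secrets/initialAdminPassword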

Enter the password that is displayed and you'll be taken to the plugin installation screen.

Click on "Install suggested plugins"; some of them are very handy, and one of them is the Ant plugin, which we'll need.

You’ll see this setup page

And then you’ll be able to create your first Jenkins user

And that’s it! Jenkins is ready to go.

Creating a build pipeline for Salesforce

Now we have Jenkins up and running. All that’s left to do is to add builds to deploy to Salesforce.

I'm assuming you have a code repository somewhere like GitHub. If you don't, you definitely need to create one to store all of your Apex classes and Visualforce pages.

1. We'll add a new folder to our repo called "ant" with a few files required for the deploy. It doesn't have to be included in the repo, but we prefer to have it there so that every developer who clones the repo has the Ant deploy files. (Although we usually use MavensMate with Sublime, so they're not strictly needed.)

To the ant folder we'll add the ant-salesforce.jar file, which you can download from Salesforce here:

https://gs0.salesforce.com/dwnld/SfdcAnt/salesforce_ant_zip

We’ll also add the standard package.xml file (in addition to the file we have in the “src” folder) to the ant folder.

Use this:
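The file contents aren't preserved in this copy; a minimal package.xml covering Apex classes and Visualforce pages (the API version is an assumption) looks roughly like:

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexPage</name>
    </types>
    <version>39.0</version>
</Package>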

And the last file is the build.xml, use this file:
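The original build.xml isn't included in this copy either; a sketch that defines the two targets referenced later (deploy_run_tests and deploy_dont_run_tests) using the sf:deploy task from ant-salesforce.jar might look like the following. Reading the Jenkins build parameters through Ant's environment properties, and the ../src layout, are assumptions:

<project name="Salesforce deploy" default="deploy_run_tests" basedir="." xmlns:sf="antlib:com.salesforce">
    <!-- Jenkins build parameters (SFUSER, SFPASS, SFTOKEN, SF_SANDBOXURL) arrive as environment variables -->
    <property environment="env"/>
    <taskdef resource="com/salesforce/antlib.xml" uri="antlib:com.salesforce" classpath="${basedir}/ant-salesforce.jar"/>

    <!-- Used by the Staging build: deploy ../src and run the org's local tests -->
    <target name="deploy_run_tests">
        <sf:deploy username="${env.SFUSER}" password="${env.SFPASS}${env.SFTOKEN}"
                   serverurl="https://${env.SF_SANDBOXURL}" deployRoot="${basedir}/../src"
                   testLevel="RunLocalTests" maxPoll="200"/>
    </target>

    <!-- Used by the PreProd build: deploy without running tests -->
    <target name="deploy_dont_run_tests">
        <sf:deploy username="${env.SFUSER}" password="${env.SFPASS}${env.SFTOKEN}"
                   serverurl="https://${env.SF_SANDBOXURL}" deployRoot="${basedir}/../src"
                   testLevel="NoTestRun" maxPoll="200"/>
    </target>
</project>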

2. Before we create a build item in Jenkins, we want to connect our GitHub account to Jenkins. Go to "Manage Jenkins" -> "Configure System". Under the "GitHub" section click on "Advanced", and then in "Additional actions" select "Convert login and password to token". Select the "From login and password" radio button and enter your GitHub username and password. Then click "Create token credentials". You should get a success message.

Now click on “Add Github Server” and from the “Credentials” drop-down add the token credentials you just created.

3. We also need to configure Git in Jenkins. Go to "Manage Jenkins" -> "Global Tool Configuration" and under the Git section add "default" as the name and "git" as the path to Git.

4. In Jenkins click on  “New Item”, enter “Staging Deploy” as the build name and select “Freestyle Project”

5. In the new build page:

Tick the “GitHub project” checkbox and put your repository URL, mine for this test is https://github.com/orweissler/SalesforceTest.git

Also, tick the "This project is parameterized" checkbox. Add a string parameter named "SFUSER" and enter the Salesforce username for your staging sandbox as the default value.

Add another parameter, a password parameter called "SFPASS", with your Salesforce password.

Add a password parameter called "SFTOKEN" with your Salesforce security token.

And the last one, a string parameter called "SF_SANDBOXURL" with the value "test.salesforce.com". All of these parameters will be used by your build.xml file.

 

6. Under the "Source Code Management" section select "Git". In "Repository URL" enter your GitHub SSH URL; mine, for example, is "git@github.com:orweissler/SalesforceTest.git".

In Credentials, click Add, then select "SSH Username with private key" and paste your private GitHub SSH key. If you don't have an SSH key on GitHub, this is how you can create one and upload it to your GitHub account: https://help.github.com/articles/generating-an-ssh-key
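The short version of that guide (email placeholder assumed):

ssh-keygen -t rsa -b 4096 -C "you@example.com"

Add the public key (~/.ssh/id_rsa.pub) to your GitHub account under Settings -> SSH keys, and paste the private key (~/.ssh/id_rsa) into the Jenkins credential.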

Jenkins will tell you if it can access your git repository or not.

Under “Build Triggers” select “Build when a change is pushed to GitHub”.

7. We’re still not done! Under “Build” select “Add build step” and choose “Invoke Ant”. In “Targets” put “deploy_run_tests”.

Click “Advanced” and in “Build File” put “ant/build.xml”

Don’t forget to save!

8. Now we'll create the same build for our PreProd environment. We'll repeat the same steps as above, but we won't select "Build when a change is pushed to GitHub", and we will enter our PreProd environment username, password and token. The last difference is that instead of the deploy_run_tests target we can use the deploy_dont_run_tests Ant target, since this is the follow-up build and we already know our tests pass.

9. Go back to Staging configuration and under “Post-build Actions” add “Build other projects”. Select your PreProd build and “Trigger even if the build is unstable” as we want to build PreProd as long as Staging tests pass and the build is successful.

And that's it! Now every push to your repository by any developer will automatically deploy the code to your shared Staging environment, make sure all tests pass, and then, if everything goes well, push it to your PreProd environment so users can perform UAT on your features. You can customise your build to fit your needs; you can add a live (production) deploy here as well and schedule it or start it manually.

Credits for helping with this article:

Bhrigun Prasad

 

Source: https://pexlify.com/blog/setting-jenkins-on-aws-as-a-ci-server-for-salesforce

Jenkinsfile Walkthrough

The sample Jenkinsfile shows how to integrate your Dev Hub and scratch orgs into a Jenkins job. The sample uses Jenkins Multibranch Pipelines. Every Jenkins setup is different. This walkthrough describes one of the ways to automate testing of your Salesforce applications. The walkthrough highlights Salesforce CLI commands to create a scratch org, upload your code, and run your tests.

This walkthrough relies on the sfdx-jenkins-package Jenkinsfile. We assume that you are familiar with the structure of the Jenkinsfile, Jenkins Pipeline DSL, and the Groovy programming language. This walkthrough demonstrates implementing a Jenkins pipeline using Salesforce CLI and scratch orgs. See the CLI Command Reference regarding the commands used.

This workflow corresponds most closely to the stages in the sample Jenkinsfile.

Define Variables

Use the def keyword to define the variables required by the Salesforce CLI commands. Assign each variable the corresponding environment variable that you previously set in your Jenkins environment.

Define the SF_USERNAME variable, but don't set its value. You do that later.

Although not required, we assume that you used the Jenkins Global Tool Configuration to create the custom tool that points to the CLI installation directory. In your Jenkinsfile, use the tool command to set the value of a variable to the path of this custom tool.

You can now reference the Salesforce CLI executable in the Jenkinsfile through that variable.

Check Out the Source Code

Before testing your code, get the appropriate version or branch from your version control system (VCS) repository. In this example, we use the Jenkins checkout scm step. We assume that the Jenkins administrator has already configured the environment to access the correct VCS repository and check out the correct branch.

Wrap All Stages in a withCredentials Command

You previously stored the JWT private key file as a Jenkins Secret File using the Credentials interface. Therefore, you must use the withCredentials command in the body of the Jenkinsfile to access the secret file. The command lets you name a credential entry, which is then extracted from the credential store and provided to the enclosed code through a variable. When using withCredentials, put all stages within its code block.

This example stores the credential ID for the JWT key file in a variable that you defined earlier and set to its corresponding environment variable. The withCredentials command fetches the contents of the secret file from the credential store and places them in a temporary location. That location is stored in another variable, which you pass to the force:auth:jwt:grant command to specify the private key securely.

Wrap All Stages in a withEnv Command

When running Jenkins jobs, it’s helpful to understand where files are being stored. There are two main directories to be mindful of: the workspace directory and the home directory. The workspace directory is unique to each job while the home directory is the same for all jobs.

The withCredentials command stores the JWT key file in the Jenkins workspace during the job. However, Salesforce CLI commands store authentication files in the home directory; these authentication files persist beyond the duration of the job.

This setup is not a problem when you run a single job, but it can cause problems when you run multiple jobs. So, what happens if you run multiple jobs using the same Dev Hub or other Salesforce user? When the CLI tries to connect to the Dev Hub as the user you authenticated, it fails to refresh the token. Why? The CLI tries to use a JWT key file that no longer exists in the other workspace, regardless of the workspace for the current job.

If you set the home directory to match the workspace directory using withEnv, the authentication files are unique for each job. Creating unique auth files per job is also more secure because each job has access only to the auth files it creates.

When using withEnv, put all stages within its code block.

Authorize Your Dev Hub Org and Create a Scratch Org

This example uses two stages: one stage to authorize the Dev Hub org and another stage to create a scratch org.

Use force:auth:jwt:grant to authorize your Dev Hub org.

You are required to run this step only once, but we suggest you add it to your Jenkinsfile and authorize each time you run the Jenkins job. This way you’re always sure that the Jenkins job is not aborted due to lack of authorization. There is typically little harm in authorizing multiple times, but keep in mind that the API call limit for your scratch org’s edition still applies.

Use the parameters of the force:auth:jwt:grant command to provide information about the Dev Hub org that you're authorizing. The values for the --clientid, --username, and --instanceurl parameters are the SF_CONSUMER_KEY, HubOrg, and SF_INSTANCE_URL environment variables you previously defined, respectively. The value of the --jwtkeyfile parameter is the variable that you set in the previous section using the withCredentials command. The --setdefaultdevhubusername parameter specifies that this Dev Hub org is the default for creating scratch orgs.

Use the force:org:create CLI command to create a scratch org. In the example, the CLI command uses the config/project-scratch-def.json file (relative to the project directory) to create the scratch org. The --json parameter specifies the output in JSON format. The --setdefaultusername parameter sets the new scratch org as the default.

The Groovy code that parses the JSON output of the force:org:create command extracts the username that was auto-generated as part of the org creation. This username, stored in the SF_USERNAME variable, is used with the CLI commands that push source, assign a permission set, and so on.
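These two stages boil down to CLI calls roughly like the following; the Jenkinsfile wraps them in sh steps, and the variable names here are illustrative:

sfdx force:auth:jwt:grant --clientid ${SF_CONSUMER_KEY} --username ${HubOrg} --jwtkeyfile ${jwt_key_file} --setdefaultdevhubusername --instanceurl ${SF_INSTANCE_URL}

sfdx force:org:create --definitionfile config/project-scratch-def.json --json --setdefaultusername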

Push Source and Assign a Permission Set

Let's populate your new scratch org with metadata. This example uses the force:source:push command to upload your source to the org. The source includes all the pieces that make up your Salesforce application: Apex classes and test classes, permission sets, layouts, triggers, custom objects, and so on.

Recall the SF_USERNAME variable that contains the auto-generated username output by the force:org:create command in an earlier stage. The code uses this variable as the argument to the --targetusername parameter to specify the username for the new scratch org.

The force:source:push command pushes all the Salesforce-related files that it finds in your project. Add a .forceignore file to your repository to list the files that you don't want pushed to the org.
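A sketch of the corresponding CLI calls; the permission set name is a placeholder, and the second command is only needed if your app uses one:

sfdx force:source:push --targetusername ${SF_USERNAME}
sfdx force:user:permset:assign --targetusername ${SF_USERNAME} --permsetname Your_Permission_Set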

Run Apex Tests

Now that your source code and test source are pushed to the scratch org, run the force:apex:test:run command to run Apex tests.

You can specify various parameters to the CLI command. In the example:

  • The --testlevel RunLocalTests option runs all tests in the scratch org, except tests that originate from installed managed packages. You can also specify RunSpecifiedTests (with the names of specific test classes or suites) to run only certain Apex tests or suites, or RunAllTestsInOrg to run all tests in the org.
  • The --resultformat tap option specifies that the command output is in Test Anything Protocol (TAP) format. The test results that are written to a file are still in JUnit and JSON formats.
  • The --targetusername option specifies the username for accessing the scratch org (the value in SF_USERNAME).

The force:apex:test:run command writes its test results in JUnit format.
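Put together, the options above give a command along these lines (the output directory and wait time are assumptions):

sfdx force:apex:test:run --testlevel RunLocalTests --resultformat tap --outputdir tests --targetusername ${SF_USERNAME} --wait 10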

Delete the Scratch Org

Salesforce reserves the right to delete a scratch org a specified number of days after it was created. You can also create a stage in your pipeline that uses force:org:delete to explicitly delete your scratch org when the tests complete. This cleanup ensures better management of your resources.
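For example:

sfdx force:org:delete --targetusername ${SF_USERNAME} --noprompt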

Source: https://developer.salesforce.com/docs/atlas.en-us.sfdx_dev.meta/sfdx_dev/sfdx_dev_ci_jenkins_sample_walkthrough.htm

Think. Build. Salesforce Solutions.

For administrators working with a single company, changesets are a great way to move updates between sandboxes and a parent org. However, when working on the enterprise tier, we often need to transfer components between various test and developer environments, which are outside the scope of changesets.

To support multiple developers working across multiple environments on various features and fixes, a proven path is to use an automated continuous integration system to simplify deployments and promote transparency.

Continuous integration

The concept behind a continuous integration, or CI, tool is that a constant flow of development changes and unit testing detects conflicts and errors within the development cycle itself. It lets you perform a baseline of quality assurance without much reliance on manual processes (at least until you get that notification that your code just broke the build). The flow is roughly: commit changes to the shared repository, let the CI server pick them up, deploy them to a test org, and run the unit tests.

Prerequisite Software

  1. Salesforce instances: source and destination orgs
  2. Eclipse with the Force.com IDE: used to pull the desired code and components to be deployed to other Salesforce instances
  3. A GitHub repository as source control
  4. Jenkins: controls the deployment
  5. JDK
  6. JRE
  7. Force.com Migration Tool
  8. Apache Ant

Installation

1. Install ANT

1) Download and install JRE 6. Verify the installation by running the following command: java -version
2) Download and install Ant (zip file) from http://ant.apache.org/bindownload.cgi and set the environment variable.
3) Download the Force.com Migration Tool zip from your Salesforce org; the path is Develop -> Tools -> Force.com Migration Tool.
4) Copy the ant-salesforce.jar file from the extracted folder and paste it into the lib directory of the installed Ant folder.
5) Set the environment variables as follows:
System Variables -> New -> Variable Name: ANT_HOME, Variable Value: C:\Program Files\apache-ant-x.xx.x
System Variables -> New -> Variable Name: JAVA_HOME, Variable Value: C:\Program Files\Java\jdk
System Variables -> Edit -> PATH: append %JAVA_HOME%\bin and %ANT_HOME%\bin to the existing PATH value.
6) Verify that C:\Program Files\Java\jdk_x.xx.x\lib contains the tools.jar file.
7) Run the following command in cmd to verify the correct installation of Ant: ant -version
8) Update the src path in the build.xml and build.properties files.
Example: note that "SFDC_CI_DEMO" is the project name created in Eclipse.
In build.xml, set basedir="../SFDC_CI_DEMO/src" on the first line.
In build.properties, set sf.src = ../SFDC_CI_DEMO/src on the last line.
9) Install Eclipse with the Force.com IDE and EGit (EGit comes with Eclipse; check whether it is there, otherwise install it from Help -> Install New Software).

2. Install Git

3. Install GitHub Desktop (GitHubSetup.exe). This is required if you are using a public GitHub repository, because you will use it to upload the build.xml and build.properties files to the cloud.
Note: Identify your GitHub repository and keep credentials and repository path handy. It will be required while pushing code from Eclipse to GitHub.

4. Install the Salesforce.com ANT Migration Tool

5. Install Jenkins
You need to start the Jenkins engine if you've installed it using the .war file. To do this, open a command prompt, navigate to the folder where jenkins.war is located and run the following command:
C:\>java -jar jenkins.war

If you've installed using the Windows installer, there's no need to start the Jenkins engine.

To run Jenkins, open your browser and connect to localhost:8080. Once you see the message "Jenkins is fully up and running" you are ready to go on.
Now open the browser and enter http://localhost:8080/ . Configure Jenkins and configure a job to start using it.

Configuration

Configure Jenkins:

Once Jenkins is up, go to Jenkins > Manage Jenkins > Configure System.
Update the JDK and Ant sections with their respective versions and paths.
Install the Git plugin from Manage Jenkins -> Manage Plugins -> Available, and choose the Git plugin to install.

Configure Job:

Go to http://localhost:8080/ and click on New Item (using the "Build a free-style software project" option). Enter the item name (this is your project name, for example SFDC_Deployment) and save.
Select Git in the "Source Code Management" section.
Enter the repository URL (example: https://github.com/ABCDEMO/REPOSITORY1) and provide your GitHub repository credentials.
Go to the "Build" section on the configuration page and select the Ant version.
In the Targets field, enter the library and property file paths.

Eg:
-lib
"C:\Program Files (x86)\apache-ant\lib\ant-salesforce.jar"
-propertyfile
"D:\Users\Username\.jenkins\jobs\JenkinsJobName\workspace\CI\build.properties"
Click on the Advanced button. In the "Build File" field, enter the path to Ant's build.xml:

D:\Users\Username\.jenkins\jobs\SFDC_Deployment\workspace\CI\build.xml

So here is the trick to get the best out of the build.xml and build.properties files copied from the Salesforce ANT tool installation zip file: update the src path in both files.

Example: note that "SFDC_CI_DEMO" is the project name created in Eclipse.

In build.xml, set basedir="../SFDC_CI_DEMO/src" on the first line.
In build.properties, set sf.src = ../SFDC_CI_DEMO/src on the last line.
After setting these paths, upload the files to the GitHub repository and then sync them with the local repository using the installed GitHub tool.
To do this, log in to the local GitHub tool and navigate to your repository. Go to settings and choose "open in explorer"; an Explorer window opens. Create a folder named "CI", then copy the updated build.xml and build.properties files into this folder. Now check them in using the GitHub tool, providing a comment.

In the "Properties" field, enter the Salesforce parameters used by Ant in build.xml:

sf.username = yourname@example.com
sf.password = PasswordSecurityToken
sf.serverurl = login.salesforce.com
sf.checkOnly = false
sf.runAllTests = false
sf.logLevel = none
Click Save. Your deployment job is ready to go!

We have seen all the technical details of setting up Jenkins, including its configuration and the other software requirements.

Let's see how to use this process and take advantage of Jenkins. As discussed at the beginning, it starts with fetching the code in Eclipse.

Let's start by using Eclipse to fetch the code and components locally and to select the right code and components for deployment.

Using Eclipse to push code to the repository

1. Add a Force.com project by giving the correct URL and credentials, and download the project data to your local PC.
2. Now right-click on the project name -> Team -> Share Project.
3. Select Git from the popup window and click Next.
4. Select a path (folder) on your system and click "Create Repository". It will create a local copy of the repository. Now click Finish to close the popup. You are now ready with a local copy.
5. Now it's time to push your code and deploy SFDC components to the cloud repository. (Step 2 as per the image above.)
6. To push your code and components, right-click on the project name, then Team -> Commit. Before starting the deployment, we should know what we are going to deploy. Let's say we take classes, VF pages, custom objects, related page layouts, a custom profile, reports and report types, and workflows.
So, select the appropriate files and components to push to the repository, specify a comment and click "Commit and Push". It will ask for the repository URL and authentication details to push the code to the correct location. The URL should end with .git, so copy and paste the proper URL from the repository, for example https://github.com/ABCDEMO/REPOSITORY1.git

Note: Ant checks the package.xml in the project's src folder to know which items have to be migrated. So, on every check-in of org changes, this updated XML should be checked into the repository as well.

So now your selected code is pushed to the repository and Jenkins is already configured. Since we have not given Jenkins any schedule to start the build process automatically, it will not start the build on its own; we have to build manually when convenient. To do so, go to the Jenkins job and click "Build Now" to run the deployment job.

To have Jenkins automatically build the changes pushed to source control, define a schedule in the Poll SCM field of the Build Triggers section. For example, if we want Jenkins to check and build every 15 minutes, we write H/15 * * * *

Source: https://www.mirketa.com/continuous-integration-jenkins/

sfdx-jenkins-org

For a fully guided walkthrough of setting up and configuring continuous integration using scratch orgs and Salesforce CLI, see the Continuous Integration Using Salesforce DX Trailhead module.

This repository shows how to successfully setup deploying to non-scratch orgs (sandbox or production) with Jenkins. We make a few assumptions in this README. Continue only if you have completed these critical configuration prerequisites.

Getting Started

  1. Fork this repo in to your GitHub account using the fork link at the top of the page.

  2. Clone your forked repo locally (see the example commands after this list).

  3. Make sure that you have the Salesforce CLI installed. Run the sfdx command and confirm you see the command output. If you don't have it installed, you can download and install it from here.

  4. Set up a custom tool in Jenkins for the Salesforce CLI. Name the custom tool and set its installation directory to the directory that contains the sfdx executable.

  5. Set up a JWT-based auth flow for the target orgs that you want to deploy to. This step creates a private key file (server.key) that is used in subsequent steps. (https://developer.salesforce.com/docs/atlas.en-us.sfdx_dev.meta/sfdx_dev/sfdx_dev_auth_jwt_flow.htm)

  6. Confirm that you can perform a JWT-based auth to the target orgs (see the example commands after this list).

    Note: For more info on setting up JWT-based auth, see Authorize an Org Using the JWT-Based Flow in the Salesforce DX Developer Guide.

  7. From your JWT-based connected app on Salesforce, retrieve the generated consumer key.

  8. Set up Jenkins global environment variables for your Salesforce consumer key and username. Note that this username is the username that you use to access your Salesforce org.

    Create an environment variable named SF_CONSUMER_KEY.

    Create an environment variable named SF_USERNAME.

  9. Store the generated server.key file as a Jenkins secret file using the Jenkins Admin Credentials interface. Make note of the new entry's ID.

  10. Set up a Jenkins global environment variable to store the ID of the secret file you created.

    Create an environment variable to hold this ID; its name must match the credential ID variable referenced by the Jenkinsfile.

  11. Create a Jenkins pipeline with the Jenkinsfile included in the root directory of the Git repository.

Now you're ready to go! When you commit and push a change, your change kicks off a Jenkins build.
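The commands referenced in steps 2, 3, and 6 are not reproduced in this copy of the README; under the assumptions above (your own fork, a server.key private key, and a connected app consumer key), they look roughly like:

git clone https://github.com/<your-github-username>/sfdx-jenkins-org.git
cd sfdx-jenkins-org

sfdx --version

sfdx force:auth:jwt:grant --clientid <your-consumer-key> --username <your-username> --jwtkeyfile server.key --instanceurl https://test.salesforce.com

Use https://login.salesforce.com as the instance URL when the target is a production org.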

Enjoy!

Contributing to the Repository

If you find any issues or opportunities for improving this repository, fix them! Feel free to contribute to this project by forking this repository and making changes to the content. Once you've made your changes, share them back with the community by sending a pull request. See How to create pull requests for more information about contributing to GitHub projects.

Reporting Issues

If you find any issues with this demo that you can't fix, feel free to report them in the issues section of this repository.

Source: https://github.com/forcedotcom/sfdx-jenkins-org


Salesforce Deployments using Jenkins

Posted 27 Jul by Chris Theodore Jayakumar

Salesforce is a CRM (Customer Relationship Management) platform that is used to record the core user journey of a customer when they purchase a service from a company. It has its own proprietary language for development called Apex. As a platform, it is capable of storing and processing large amounts of data. It also has different connectors which neatly integrate with other platforms.

I recently had the opportunity to create pipelines for deploying code to the Salesforce platform using Jenkins. This post describes the challenges I encountered implementing pipelines using a number of different techniques:

  1. Ant
  2. Ant with an improved Git strategy
  3. SFDX (Salesforce Developer Experience)

If you are looking for ways to orchestrate Salesforce deployments using Jenkins, then keep reading; you can thank me at the end of the article 🙂

Jenkins is the most popular CI tool in the market, and provides a wide variety of open-source plugins. Best of all, its support for the Groovy language means pretty much anything can be done in a script. Our first approach was an Ant project that took Salesforce Apex code, built it into a package, and deployed it to a Salesforce environment:

Note that the package deployment is done in two stages, which we will now drill into:

Stage 1

This stage validates the package against a given environment to ensure existing functionality isn't broken. This is done by running all the unit tests in that environment without actually deploying the package, which is achieved by setting checkOnly to true.
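With the Force.com migration tool, this maps to the checkOnly attribute on the sf:deploy task, roughly (property names assumed):

<sf:deploy username="${sf.username}" password="${sf.password}" serverurl="${sf.serverurl}"
           deployRoot="src" checkOnly="true" testLevel="RunLocalTests" maxPoll="200"/>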

In our case, the team used project branches which were managed by the team lead. The developers would create feature branches for user story implementation, and merge them into the "central hub" integration branch once complete. Every day, the team lead would merge the integration branch into the project branch. Note that a local code merge was done to ensure that the feature branch code was always updated from the upstream integration branch.

Stage 2

If stage 1 was successful, stage 2 then deployed the package to the environment:

This approach took around 20 minutes to validate and another 20 minutes to deploy. In addition, someone from the operations team needed to merge the pull request in order to trigger auto deployment via Jenkins.

The time taken to validate a package against an environment varies, depending on the size of your Salesforce Organisation. We found that the number of unit tests strongly influenced the time taken to deploy. By default, Salesforce requires 75% code coverage from unit tests for a validation to succeed. Whilst this is regarded as a best practice value, it can be modified if necessary.

So, in summary, this approach had a high cost in the form of branch maintenance and conflict resolution, as it required the Team Lead to merge the project branch every day, and the operations team to merge to trigger a deployment. It was also very slow.

The primary goal for our second approach was to allow developers to trigger deployment without the need of the operations team, but also to ensure deployment execution was still approved by privileged people (e.g. Team Leads). This was important because the developers worked offshore while the operations team worked onshore, so removing the merge dependency was a way to bridge the gap between different locations and timezones.

Incidentally, whilst trying this approach SonarQube also introduced support for Apex code analysis, so we thought we'd include that in addition to the traditional PMD code scan.

The key to this approach was that the pipeline behaviour was made conditional on the pull request title. If the title contained a designated deploy keyword, then the pipeline would deploy to that Salesforce environment. Otherwise the pipeline only validated the code against the environment without deploying.

The process involved a developer raising a pull request to validate code against a Salesforce environment. On validation success, the pull request was updated with the build status. Based on the build status, a team lead then updated the title of the pull request to the deploy keyword, which triggered Jenkins to rerun the pipeline and deploy to the Salesforce environment.

Note that for this approach, the project branches were removed and the integration branch was the only upstream branch across all the teams.

This pipeline process worked well, and the operations team was greatly freed up. However, we still hadn't sped up the validation or deployment process. To make the process more efficient and decrease the execution time of the pipeline, we had to try something completely different.

The Salesforce Developer Experience (SFDX) introduces a lot of new features, meaning that our code had to be migrated across to the new SFDX structure for it to work in the SFDX environment.

Most importantly, SFDX has its own CLI tool for deploying code or running validation against any environment. It also supports the creation of Scratch Orgs, an environment to be created on-the-fly for code validation, but without having to retain persistent environments.

For this approach, the branching strategy was similar to what we had used previously, except that the code was deployed using a CLI command:

// Deploy to SIT with no unit tests
sfdx force:source:deploy -p deploypackage/force-app/main/default -w $SF_TIMEOUT -u $SF_USERNAME

Note that this command deployed code as independent files, instead of creating a complete deployment package. This helped us strip the deployment down to only the files that were modified in a particular pull request (by using Git APIs). Note, however, that if a file was deleted as part of a pull request, then an additional command would need to be run to delete the corresponding components:

// Delete files using the delete command
sfdx force:source:delete -p deletepackage\\force-app\\main\\default\\ -w 90 -u $SF_USERNAME -r

Note also that, to implement this, the Jenkins pipeline had to create two separate folders, and selectively place files in them according to the pull request.

This approach brought the overall duration of a typical deployment down to around 2 seconds (dependent on the number of files included in the pull request). This greatly decreased the time a developer had to spend waiting for feedback. The number of unit tests that were run also vastly decreased, as Salesforce only ran a given unit test if associated functionality had been changed.

For us, the key to a deployment pipeline that was both quick and required minimal manual intervention was our Git branching strategy, along with the use of more efficient features introduced in recent Salesforce releases. There are plenty of other tools for performing similar deployment strategies in Salesforce, but I believe that which one you use depends on how you use Salesforce, and how far you are willing to go to ensure code quality in the deployment process.

  1. https://trailhead.salesforce.com/en/content/learn/trails/move-to-a-continuous-integration-development
  2. https://trailhead.salesforce.com/content/learn/modules/sfdx_app_dev
Source: https://shinesolutions.com//07/27/salesforce-deployments-using-jenkins/
