How to build Graal-enabled JDK8 on CircleCI?

Citation: the feature image on the blog can be found on Flickr and was created by Luca Galli. The image in one of the sections below can also be found on Flickr and was created by fklv (Obsolete hipster).


The GraalVM compiler is a replacement for HotSpot’s server JIT compiler, widely known as the C2 compiler. It is written in Java with the goal of better performance (among other goals) compared to C2. Changes introduced starting with Java 9 mean that we can now plug a hand-written JIT compiler into the JVM, thanks to JVMCI. The researchers and engineers at Oracle Labs have created a variant of JDK8 with JVMCI enabled, which can be used to build the GraalVM compiler. The GraalVM compiler is open source and is available on GitHub (along with the HotSpot JVMCI sources needed to build it). This gives us the ability to fork/clone it and build our own version of the GraalVM compiler.

In this post, we are going to build the GraalVM compiler with JDK8 on CircleCI. The resulting artifacts are going to be:

– JDK8 embedded with the GraalVM compiler, and
– a zip archive containing Graal & Truffle modules/components.

Note: we are not covering how to build the whole of the GraalVM suite in this post; that can be done via another post. These scripts can, however, be used to do that, and there exists a branch which contains the rest of the steps.

Why use a CI tool to build the GraalVM compiler?


Continuous integration (CI) and continuous deployment (CD) tools have many benefits. One of the greatest is the ability to check the health of the code-base. Seeing why your builds are failing provides you with an opportunity to make a fix faster. For this project, it is important that we are able to verify and validate the scripts required to build the GraalVM compiler for Linux and macOS, both locally and in a Docker container.

A CI/CD tool lets us add automated tests to ensure that we get the desired outcome from our scripts when every PR is merged. In addition to ensuring that our new code does not introduce a breaking change, another great feature of CI/CD tools is that we can automate the creation of binaries and the automatic deployment of those binaries, making them available for open source distribution.

Let’s get started

During the process of researching CircleCI as a CI/CD solution to build the GraalVM compiler, I learned that we could run builds via two different approaches, namely:

– A CircleCI build with a standard Docker container (longer build time, longer config script)
– A CircleCI build with a pre-built, optimised Docker container (shorter build time, shorter config script)

We will now go through the two approaches mentioned above and see the pros and cons of both of them.

Approach 1: using a standard Docker container

For this approach, CircleCI requires a docker image that is available on Docker Hub or another public/private registry it has access to. We will have to install the necessary dependencies in that environment for the build to succeed. We expect the first build to run longer; depending on the level of caching, subsequent builds will speed up.

To understand how this is done, we will go through the CircleCI configuration file (stored in .circleci/config.yml) section by section; see config.yml in .circleci for the full listing, and commit df28ee7 for the source changes.
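Before diving in, a heavily abbreviated sketch of the overall shape of such a config file may help orientation (CircleCI 2.0 syntax is my assumption here; the linked config.yml is the authoritative, complete version):

```yaml
# Abbreviated sketch only - the linked config.yml is the real, full file.
version: 2
jobs:
  build:
    docker:
      - image: adoptopenjdk/openjdk8:jdk8u152-b16
    steps:
      - checkout
      - run:
          name: Install Os dependencies
          command: ./build/x86_64/linux_macos/osDependencies.sh
```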

Explaining sections of the config file

The below lines in the configuration file will ensure that our installed applications are cached (referring to the two specific directories) so that we don’t have to reinstall the dependencies each time a build occurs:

    dependencies:
      cache_directories:
        - "vendor/apt"
        - "vendor/apt/archives"

We will be referring to the docker image by its full name (as available on http://hub.docker.com under the account name used – adoptopenjdk). In this case, it is a standard docker image containing JDK8 made available by the good folks behind the Adopt OpenJDK build farm. In theory, we can use any image as long as it supports the build process. It will act as the base layer on which we will install the necessary dependencies:

        docker:
          - image: adoptopenjdk/openjdk8:jdk8u152-b16

Next, in the pre-Install Os dependencies step, we restore the cache if it already exists. This may look a bit odd, but the below implementation with unique key labels is what the docs recommend:

          - restore_cache:
              keys:
                - os-deps-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
                - os-deps-{{ arch }}-{{ .Branch }}

Then, in the Install Os dependencies step we run the respective shell script to install the dependencies needed. We have set this step to timeout if the operation takes longer than 2 minutes to complete (see docs for timeout):

          - run:
              name: Install Os dependencies
              command: ./build/x86_64/linux_macos/osDependencies.sh
              timeout: 2m
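The osDependencies.sh script itself is not reproduced in this post; as a rough, hypothetical sketch (the package names below are my assumptions – the repository’s script is authoritative), it boils down to something like:

```shell
# Hypothetical sketch of an osDependencies.sh-style script; the package list
# here is an assumption, the real list lives in the repository's script.
set -eu

DEPENDENCIES="build-essential curl git mercurial python unzip zip"

install_deps() {
    # On a Debian/Ubuntu base image this would be:
    #   apt-get update && apt-get install -y ${DEPENDENCIES}
    # We only echo here so the sketch is safe to run anywhere.
    echo "installing: ${DEPENDENCIES}"
}

install_deps
```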

Then, in the post-Install Os dependencies step, we save the results of the previous step – the layer from the above run step (the key name is formatted to ensure uniqueness, and the specific paths to save are included):

          - save_cache:
              key: os-deps-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
              paths:
                - vendor/apt
                - vendor/apt/archives

Then, in the pre-Build and install make via script step, we restore the cache, if one already exists:

          - restore_cache:
              keys:
                - make-382-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
                - make-382-{{ arch }}-{{ .Branch }}

Then, in the Build and install make via script step, we run the shell script to install a specific version of make; it is set to time out if the step takes longer than 1 minute to finish:

          - run:
              name: Build and install make via script
              command: ./build/x86_64/linux_macos/installMake.sh
              timeout: 1m
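Again, as a hypothetical sketch of what an installMake.sh-style script does (the download URL and install path are my assumptions, and the commands are only echoed so the sketch is safe to run anywhere; the repository’s script is authoritative):

```shell
# Hypothetical sketch: fetch and build GNU make 3.82 from source.
MAKE_VERSION="3.82"
MAKE_TARBALL="make-${MAKE_VERSION}.tar.gz"
MAKE_URL="https://ftp.gnu.org/gnu/make/${MAKE_TARBALL}"

build_and_install_make() {
    echo "curl -L -O ${MAKE_URL}"
    echo "tar xzf ${MAKE_TARBALL}"
    echo "cd make-${MAKE_VERSION} && ./configure && ./build.sh"
    echo "cp make /usr/local/bin/make"
}

build_and_install_make
```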

Then, in the post Build and install make via script step, we save the results of the above action to the cache:

          - save_cache:
              key: make-382-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
              paths:
                - /make-3.82/
                - /usr/bin/make
                - /usr/local/bin/make
                - /usr/share/man/man1/make.1.gz
                - /lib/

Then, we define environment variables to update JAVA_HOME and PATH at runtime. The environment variables are sourced here so that they are remembered in subsequent steps until the end of the build process (please keep this in mind):

          - run:
              name: Define Environment Variables and update JAVA_HOME and PATH at Runtime
              command: |
                echo '....'     # a number of echoes displaying env variable values
                source ${BASH_ENV}
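The source ${BASH_ENV} trick can be demonstrated in isolation: each CircleCI run step starts a fresh shell, so export lines are appended to the file behind ${BASH_ENV} and sourced again at the start of later steps. Below is a minimal simulation (the JAVA_HOME value is a stand-in, not the path used in the real build):

```shell
# Simulate the BASH_ENV pattern with a temporary file; the JAVA_HOME value
# is a stand-in, not the path used in the real build.
BASH_ENV="$(mktemp)"

# Step 1: persist a variable for later steps by appending an export line.
echo "export JAVA_HOME=/opt/jdk8-jvmci" >> "${BASH_ENV}"

# Step 2 (a fresh shell on CircleCI): pick the value back up.
. "${BASH_ENV}"
echo "JAVA_HOME=${JAVA_HOME}"
```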

Then, in the step to Display Hardware, Software, Runtime environment and dependency versions, as a best practice we display environment-specific information and record it in the logs for posterity (also useful for debugging when things go wrong):

          - run:
              name: Display HW, SW, Runtime env. info and versions of dependencies
              command: ./build/x86_64/linux_macos/lib/displayDependencyVersion.sh

Then, we run the step to set up mx – this is important because mx is a specialised build system created to facilitate compiling and building Graal/GraalVM and its components:

          - run:
              name: Setup MX
              command: ./build/x86_64/linux_macos/lib/setupMX.sh ${BASEDIR}

Then, we run the important step to Build JDK JVMCI (we build the JDK with JVMCI enabled here), which times out if the process runs for longer than 15 minutes without any output, or longer than 20 minutes in total:

          - run:
              name: Build JDK JVMCI
              command: ./build/x86_64/linux_macos/lib/build_JDK_JVMCI.sh ${BASEDIR} ${MX}
              timeout: 20m
              no_output_timeout: 15m

Then, we run the step Run JDK JVMCI Tests, which runs tests as part of the sanity check after building the JDK JVMCI:

          - run:
              name: Run JDK JVMCI Tests
              command: ./build/x86_64/linux_macos/lib/run_JDK_JVMCI_Tests.sh ${BASEDIR} ${MX}

Then, we run the step Setting up environment and Build GraalVM Compiler, to set up the build environment with the necessary environment variables which will be used by the steps to follow:

          - run:
              name: Setting up environment and Build GraalVM Compiler
              command: |
                echo ">>>> Currently JAVA_HOME=${JAVA_HOME}"
                JDK8_JVMCI_HOME="$(cd ${BASEDIR}/graal-jvmci-8/ && ${MX} --java-home ${JAVA_HOME} jdkhome)"
                echo "export JVMCI_VERSION_CHECK='ignore'" >> ${BASH_ENV}
                echo "export JAVA_HOME=${JDK8_JVMCI_HOME}" >> ${BASH_ENV}
                source ${BASH_ENV}

Then, we run the step Build the GraalVM Compiler and embed it into the JDK (JDK8 with JVMCI enabled), which times out if the process runs for longer than 7 minutes without any output, or longer than 10 minutes in total:

          - run:
              name: Build the GraalVM Compiler and embed it into the JDK (JDK8 with JVMCI enabled)
              command: |
                echo ">>>> Using JDK8_JVMCI_HOME as JAVA_HOME (${JAVA_HOME})"
                ./build/x86_64/linux_macos/lib/buildGraalCompiler.sh ${BASEDIR} ${MX} ${BUILD_ARTIFACTS_DIR}
              timeout: 10m
              no_output_timeout: 7m

Then, we run simple sanity checks to verify the validity of the artifacts once a build has completed, just before archiving them:

          - run:
              name: Sanity check artifacts
              command: |
                ./build/x86_64/linux_macos/lib/sanityCheckArtifacts.sh ${BASEDIR} ${JDK_GRAAL_FOLDER_NAME}
              timeout: 3m
              no_output_timeout: 2m

Then, we run the Archiving artifacts step (compressing and copying the final artifacts into a separate folder), which times out if the process runs for longer than 2 minutes without any output, or longer than 3 minutes in total:

          - run:
              name: Archiving artifacts
              command: |
                ./build/x86_64/linux_macos/lib/archivingArtifacts.sh ${BASEDIR} ${MX} ${JDK_GRAAL_FOLDER_NAME} ${BUILD_ARTIFACTS_DIR}
              timeout: 3m
              no_output_timeout: 2m

For posterity and debugging purposes, we capture the generated logs from the various folders and archive them:

          - run:
              name: Collecting and archiving logs (debug and error logs)
              command: |
                ./build/x86_64/linux_macos/lib/archivingLogs.sh ${BASEDIR}
              timeout: 3m
              no_output_timeout: 2m
              when: always
          - store_artifacts:
              name: Uploading logs
              path: logs/

Finally, we store the generated artifacts at a specified location – the below lines will make the location available on the CircleCI interface (we can download the artifacts from here):

          - store_artifacts:
              name: Uploading artifacts in jdk8-with-graal-local
              path: jdk8-with-graal-local/

Approach 2: using a pre-built optimised Docker container

For approach 2, we will use a pre-built docker container that has been created and built locally with all the necessary dependencies, saved as a docker image, and then pushed to a remote registry, e.g. Docker Hub. We will then reference this docker image in the CircleCI environment via the configuration file. This saves us the time and effort of running all the commands to install the necessary dependencies to create the necessary environment (see the detailed steps in the Approach 1 section).

We expect this build to run for a shorter time than the previous one; this speedup is a result of the pre-built docker image (see the Steps to build the pre-built docker image section to see how this is done). The additional speed benefit comes from the fact that CircleCI caches the docker image layers, which in turn results in a quicker startup of the build environment.

We will go through the CircleCI configuration file (stored in .circleci/config.yml) section by section for this approach; see config.yml in .circleci for the full listing, and commit e5916f1 for the source changes.

Explaining sections of the config file

Here again, we will refer to the docker image by its full name. It is a pre-built docker image, neomatrix369/graal-jdk8, made available by neomatrix369. It was built and uploaded to Docker Hub before the CircleCI build was started, and it contains the necessary dependencies for the GraalVM compiler to be built:

        docker:
          - image: neomatrix369/graal-jdk8:${IMAGE_VERSION:-python-2.7}
        steps:
          - checkout
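The ${IMAGE_VERSION:-python-2.7} part of the image tag above uses standard shell default expansion: if IMAGE_VERSION is unset or empty, the tag falls back to python-2.7. A quick illustration (the python-3.7 tag is a hypothetical alternative, not an image I have verified exists):

```shell
# Standard ${VAR:-default} parameter expansion, as used in the image tag above.
unset IMAGE_VERSION
TAG_DEFAULT="${IMAGE_VERSION:-python-2.7}"   # falls back to the default

IMAGE_VERSION="python-3.7"                   # hypothetical alternative tag
TAG_OVERRIDE="${IMAGE_VERSION:-python-2.7}"  # the set value wins

echo "default: ${TAG_DEFAULT}, override: ${TAG_OVERRIDE}"
```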

All the sections below do the exact same tasks (and for the same purpose) as in Approach 1, see Explaining sections of the config file section.

Except, we have removed the below sections as they are no longer required for Approach 2:

          - restore_cache:
              keys:
                - os-deps-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
                - os-deps-{{ arch }}-{{ .Branch }}
          - run:
              name: Install Os dependencies
              command: ./build/x86_64/linux_macos/osDependencies.sh
              timeout: 2m
          - save_cache:
              key: os-deps-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
              paths:
                - vendor/apt
                - vendor/apt/archives
          - restore_cache:
              keys:
                - make-382-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
                - make-382-{{ arch }}-{{ .Branch }}
          - run:
              name: Build and install make via script
              command: ./build/x86_64/linux_macos/installMake.sh
              timeout: 1m
          - save_cache:
              key: make-382-{{ arch }}-{{ .Branch }}-{{ .Environment.CIRCLE_SHA1 }}
              paths:
                - /make-3.82/
                - /usr/bin/make
                - /usr/local/bin/make
                - /usr/share/man/man1/make.1.gz

In the following section, I will go through the steps showing how to build the pre-built docker image. This involves running the bash scripts ./build/x86_64/linux_macos/osDependencies.sh and ./build/x86_64/linux_macos/installMake.sh to install the necessary dependencies as part of building the docker image, and finally pushing the image to Docker Hub (it can be pushed to any other remote registry of your choice).

Steps to build the pre-built docker image

– Run build-docker-image.sh (see bash script source), which depends on the presence of a Dockerfile (see docker script source). The Dockerfile does the necessary task of installing the dependencies inside the container, i.e. it runs the bash scripts ./build/x86_64/linux_macos/osDependencies.sh and ./build/x86_64/linux_macos/installMake.sh:

    $ ./build-docker-image.sh
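For illustration, a minimal Dockerfile along those lines might look like the sketch below (the base image and paths are my assumptions; the repository’s Dockerfile is the authoritative version):

```dockerfile
# Hypothetical sketch; see the repository's Dockerfile for the real version.
FROM adoptopenjdk/openjdk8:jdk8u152-b16

# Bake the dependency scripts into the image and run them at image build time,
# so that the resulting image already contains everything the CI build needs.
COPY build/x86_64/linux_macos/osDependencies.sh /tmp/
COPY build/x86_64/linux_macos/installMake.sh /tmp/
RUN chmod +x /tmp/osDependencies.sh /tmp/installMake.sh \
    && /tmp/osDependencies.sh \
    && /tmp/installMake.sh
```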

– Once the image has been built successfully, run push-graal-docker-image-to-hub.sh after setting USER_NAME and IMAGE_NAME (see source code); otherwise it will use the default values set in the bash script:

    $ USER_NAME="[your docker hub username]" IMAGE_NAME="[any image name]" \
        ./push-graal-docker-image-to-hub.sh

CircleCI config file statistics: Approach 1 versus Approach 2

| Areas of interest | Approach 1 | Approach 2 |
| --- | --- | --- |
| Config file (full source list) | build-on-circleci | build-using-prebuilt-docker-image |
| Commit point (sha) | df28ee7 | e5916f1 |
| Lines of code (loc) | 110 lines | 85 lines |
| Source lines (sloc) | 110 sloc | 85 sloc |
| Steps (steps: section) | 19 | 15 |
| Performance (see Performance section) | Some speedup due to caching, but slower than Approach 2 | Speedup due to the pre-built docker image and caching at different steps; faster than Approach 1 |

Ensure DLC (Docker Layer Caching) is enabled (it’s a paid feature).

What not to do?

Approach 1 issues

I came across things that wouldn’t work initially, but were later fixed with changes to the configuration file or the scripts:

  • please make sure the .circleci/config.yml file is always in the root directory of the project
  • when using the store_artifacts directive in the .circleci/config.yml file, set the value to a fixed folder name, i.e. jdk8-with-graal-local/ – in our case, setting the path to ${BASEDIR}/project/jdk8-with-graal didn’t create the resulting artifact once the build was finished, hence the fixed path name suggestion
  • environment variables: keep in mind that each command runs in its own shell, hence values assigned to environment variables inside one shell execution environment aren’t visible outside it; follow the method used in the context of this post and set the environment variables such that all the commands can see the required values, to avoid misbehaviour or unexpected results at the end of each step
  • caching: read about the caching functionality before using it; for more details refer to the CircleCI caching docs and see how it has been implemented in the context of this post – this will help avoid confusion and make better use of the functionality provided by CircleCI

Approach 2 issues

  • Caching: check the docs before trying to use the Docker Layer Caching (DLC) option, as it is a paid feature. Knowing this clears up the doubt about why CircleCI keeps downloading all the layers during each build; for details, refer to the Docker Layer Caching docs. It also clarifies why, on the free tier, builds are not as fast as we would like them to be.

General note:

  • Light-weight instances: to avoid the pitfall of thinking we can run heavy-duty builds, check the documentation on the technical specifications of the instances. If we run standard Linux commands to probe the technical specifications of the instance, we may be misled into thinking they are high-specification machines. See the step that lists the hardware and software details of the instance (see the Display HW, SW, Runtime env. info and versions of dependencies section). The instances are actually virtual machines or container-like environments with resources of around 2 CPUs/4096 MB. This means we can’t run long-running or heavy-duty builds like building the whole GraalVM suite. Maybe there is another way to handle these kinds of builds, or maybe such builds need to be decomposed into smaller parts.
  • Global environment variables: as each run line in config.yml runs in its own shell context, environment variables set in one context are not visible in another. To overcome this, we adopted two methods:
    – pass values as parameters to the bash/shell scripts being called, so that the scripts can access the values of the environment variables
    – use the source command as a run step to make environment variables accessible globally
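The first method – passing values as parameters – can be sketched as follows (BASEDIR’s value and the describe_build function are stand-ins for the real scripts, purely for illustration):

```shell
# Method 1: pass the value explicitly instead of relying on an exported
# variable surviving across shells. BASEDIR and describe_build are stand-ins.
BASEDIR="/tmp/graal-build"

describe_build() {
    basedir="$1"    # the called script receives the value as an argument
    echo "building under ${basedir}"
}

RESULT="$(describe_build "${BASEDIR}")"
echo "${RESULT}"
```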

End result and summary

We see the below screen after a build has finished successfully (the last step, i.e. Uploading artifacts, lists where the artifacts have been copied):

The artifacts are now placed in the right folder for download. We are mainly concerned about the jdk8-with-graal.tar.gz artifact.

Performance

Before writing this post, I ran multiple passes of both the approaches and jotted down the time taken to finish the builds, which can be seen below:

Approach 1: standard CircleCI build (caching enabled)
– 13 mins 28 secs
– 13 mins 59 secs
– 14 mins 52 secs
– 10 mins 38 secs
– 10 mins 26 secs
– 10 mins 23 secs
Approach 2: using pre-built docker image (caching enabled, DLC feature unavailable)
– 13 mins 15 secs
– 15 mins 16 secs
– 15 mins 29 secs
– 15 mins 58 secs
– 10 mins 20 secs
– 9 mins 49 secs

Note: Approach 2 should show better performance when using a paid tier, as Docker Layer Caching is available as part of that plan.

Sanity check

In order to be sure that by using both the above approaches we have actually built a valid JDK embedded with the GraalVM compiler, we perform the following steps with the created artifact:

– Firstly, download the jdk8-with-graal.tar.gz artifact from under the Artifacts tab on the CircleCI dashboard (needs sign-in):

– Then, extract the .tar.gz file:

    tar xvf jdk8-with-graal.tar.gz

– Thereafter, run the below command to check the JDK binary is valid:

    cd jdk8-with-graal
    ./bin/java -version

– And finally check if we get the below output:

    openjdk version "1.8.0-internal"
    OpenJDK Runtime Environment (build 1.8.0-internal-jenkins_2017_07_27_20_16-b00)
    OpenJDK 64-Bit Graal:compiler_ab426fd70e30026d6988d512d5afcd3cc29cd565:compiler_ab426fd70e30026d6988d512d5afcd3cc29cd565 (build 25.71-b01-internal-jvmci-0.46, mixed mode)

– Similarly, to confirm if the JRE is valid and has the GraalVM compiler built in, we do this:

    ./bin/jre/java -version

– And check if we get a similar output as above:

    openjdk version "1.8.0-internal"
    OpenJDK Runtime Environment (build 1.8.0-internal-jenkins_2017_07_27_20_16-b00)
    OpenJDK 64-Bit Graal:compiler_ab426fd70e30026d6988d512d5afcd3cc29cd565:compiler_ab426fd70e30026d6988d512d5afcd3cc29cd565 (build 25.71-b01-internal-jvmci-0.46, mixed mode)
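The manual check above can also be scripted – the helper below inspects a version banner for Graal/JVMCI markers (the check_graal_banner function and the sample string are illustrative, not part of the original scripts; in real use the banner would come from ./bin/java -version 2>&1):

```shell
# Scripted form of the manual sanity check: does a version banner mention
# Graal/JVMCI? The function name and sample banner are illustrative only;
# in real use: banner="$(./bin/java -version 2>&1)".
check_graal_banner() {
    banner="$1"
    case "${banner}" in
        *jvmci*|*Graal*) echo "ok" ;;
        *)               echo "missing" ;;
    esac
}

SAMPLE='OpenJDK 64-Bit Graal (build 25.71-b01-internal-jvmci-0.46, mixed mode)'
STATUS="$(check_graal_banner "${SAMPLE}")"
echo "graal banner check: ${STATUS}"
```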

With this, we have successfully built JDK8 with the GraalVM compiler embedded in it and also bundled the Graal and Truffle components in an archive file, both of which are available for download via the CircleCI interface.

Note: you will notice that we do perform sanity checks on the built binaries just before packing them into compressed archives, as part of the build steps (see the bottom sections of the CircleCI configuration files).

Nice badges!

We all like to show off, and we also like to know the current status of our build jobs. A green build-status badge is a nice indication of success, which looks like the below on a markdown README page:

We can very easily embed both of these status badges displaying the build status of our project (branch-specific, i.e. master or another branch you have created) built on CircleCI (see the docs on how to do that).

Conclusions

We explored two approaches to building the GraalVM compiler in the CircleCI environment. They were good experiments for comparing the performance of the two approaches and seeing how easily we can implement them. We also saw a number of things to avoid, and how useful some of the CircleCI features are. The documentation and forums serve you well when trying to make a build work or when you get stuck with something.

Once we know the CircleCI environment, it’s pretty easy to use and always gives us the exact same response (consistent behaviour) every time we run it. Its ephemeral nature means we are guaranteed a clean environment before each run and a clean up after it finishes. We can also set up checks on build time for every step of the build, and abort a build if the time taken to finish a step surpasses the threshold time-period.

The ability to use pre-built docker images coupled with Docker Layer Caching on CircleCI can be a major performance boost (saves us build time needed to reinstall any necessary dependencies at every build). Additional performance speedups are available on CircleCI, with caching of the build steps – this again saves build time by not having to re-run the same steps if they haven’t changed.

There are a lot of useful features available on CircleCI, with plenty of documentation, and everyone on the community forum is helpful; questions are answered pretty much instantly.

Next, let’s build the same and more on another build environment/build farm – hint, hint, are you thinking the same as me? The Adopt OpenJDK build farm? We can give it a try!

Thanks and credits to Ron Powell from CircleCI and Oleg Šelajev from Oracle Labs for proof-reading and giving constructive feedback. 

Please do let me know if this is helpful by dropping a line in the comments below or by tweeting at @theNeomatrix369, and I would also welcome feedback, see how you can reach me, above all please check out the links mentioned above.

Useful resources

– Links to useful CircleCI docs
About Getting started | Videos
About Docker
Docker Layer Caching
About Caching
About Debugging via SSH
CircleCI cheatsheet
CircleCI Community (Discussions)
Latest community topics
– CircleCI configuration and supporting files
Approach 1: https://github.com/neomatrix369/awesome-graal/tree/build-on-circleci (config file and other supporting files i.e. scripts, directory layout, etc…)
Approach 2: https://github.com/neomatrix369/awesome-graal/tree/build-on-circleci-using-pre-built-docker-container (config file and other supporting files i.e. scripts, directory layout, etc…)
Scripts to build Graal on Linux, macOS and inside the Docker container
Truffle served in a Holy Graal: Graal and Truffle for polyglot language interpretation on the JVM
Learning to use Wholly GraalVM!
Building Wholly Graal with Truffle!

How is Java / JVM built ? Adopt OpenJDK is your answer!

Introduction & history
As some of you may already know, starting with Java 7, OpenJDK is the Reference Implementation (RI) of Java. The below timeline gives you an idea of the history of OpenJDK:
OpenJDK history (2006 till date)
If you have wondered about the JDK or JRE binaries that you download from vendors like Oracle, Red Hat, etcetera, then the clue is that these all stem from OpenJDK. Each vendor then adds some extra artefacts that are not open source yet due to security, proprietary or other reasons.


What is OpenJDK made of ?
OpenJDK is made up of a number of repositories, namely corba, hotspot, jaxp, jaxws, jdk, langtools, and nashorn. Between OpenJDK8 and OpenJDK9 no new repositories were introduced, but there were lots of changes and restructuring, primarily due to Jigsaw – the modularisation of Java itself [2] [3] [4] [5].
repo composition, language breakdown (metrics are estimated)
Recent history
OpenJDK Build Benchmarks – build-infra (Nov 2011) by Fredrik Öhrström, ex-Oracle, OpenJDK hero!

Fredrik Öhrström visited the LJC [16] in November 2011, where he showed us how to build OpenJDK on the three major platforms and distributed a four-page leaflet with benchmarks of the various components and how long they took to build. The new build system and new makefiles are a result of the build-system rewrite (build-infra).


Below are screen-shots of the leaflets, a good reference to compare our journey:

How has Java – the language and platform – been built over the years?

Java is built by bootstrapping an older (previous) version of Java – i.e. Java is built using Java itself as its building block: older components are put together to create a new component, which in the next phase becomes the building block. A good example of bootstrapping can be found at Scheme from Scratch [6] or on Wikipedia [7].


OpenJDK8 [8] is compiled and built using JDK7; similarly, OpenJDK9 [9] is compiled and built using JDK8. In theory, OpenJDK8 can be compiled using the images created from OpenJDK8, and similarly OpenJDK9 using OpenJDK9. Using a process called bootcycle images, a JDK image of OpenJDK is created and then, using that same image, OpenJDK is compiled again, which can be accomplished using a make command option:

    $ make bootcycle-images       # Build images twice, second time with newly built JDK

make offers a number of options under OpenJDK8 and OpenJDK9; you can build individual components or modules by naming them, i.e.

    $ make [component-name] | [module-name]

or even run multiple build processes in parallel, i.e.

    $ make JOBS=<n>                 # Run <n> parallel make jobs

Finally, install the built artefact using the install option, i.e.

    $ make install


Some myths busted
OpenJDK (or HotSpot, to be more specific) isn’t completely written in C/C++; a good part of the code-base is good ol’ Java (see the composition figure above). So you don’t have to be a hard-core C/C++ developer to contribute to OpenJDK. Even the underlying C/C++ code-base isn’t scary or daunting to look at. For example, here is an extract of a code snippet from vm/memory/universe.cpp in the HotSpot repo –
    . . .
    Universe::initialize_heap()

    if (UseParallelGC) {
    #ifndef SERIALGC
        Universe::_collectedHeap = new ParallelScavengeHeap();
    #else  // SERIALGC
        fatal("UseParallelGC not supported in this VM.");
    #endif // SERIALGC
    } else if (UseG1GC) {
    #ifndef SERIALGC
        G1CollectorPolicy* g1p = new G1CollectorPolicy();
        G1CollectedHeap* g1h = new G1CollectedHeap(g1p);
        Universe::_collectedHeap = g1h;
    #else  // SERIALGC
        fatal("UseG1GC not supported in java kernel vm.");
    #endif // SERIALGC
    } else {
        GenCollectorPolicy* gc_policy;

        if (UseSerialGC) {
            gc_policy = new MarkSweepPolicy();
        } else if (UseConcMarkSweepGC) {
    #ifndef SERIALGC
            if (UseAdaptiveSizePolicy) {
                gc_policy = new ASConcurrentMarkSweepPolicy();
            } else {
                gc_policy = new ConcurrentMarkSweepPolicy();
            }
    #else  // SERIALGC
            fatal("UseConcMarkSweepGC not supported in this VM.");
    #endif // SERIALGC
        } else { // default old generation
            gc_policy = new MarkSweepPolicy();
        }

        Universe::_collectedHeap = new GenCollectedHeap(gc_policy);
    }
    . . .
(please note that the above code snippet might have changed since it was published here)
What appears clear from the above code-block is how pre-processor directives are used to create HotSpot code that supports a certain type of GC, i.e. Serial GC or Parallel GC. The type of GC policy is also selected in the above code-block when one or more GC switches are toggled, i.e. UseAdaptiveSizePolicy, when enabled, selects the Adaptive Size Concurrent Mark and Sweep policy. If neither UseSerialGC nor UseConcMarkSweepGC is selected, then the fallback is the Mark and Sweep policy. All of this and more is pretty clearly readable and verbose – not just nicely formatted code, but code that almost reads like English.


Further commentary can be found in the section called Deep dive Hotspot stuff in the Adopt OpenJDK Intermediate & Advance experiences [11] document.


Steps to build your own JDK or JRE
Earlier we mentioned JDK and JRE images – these are no longer available only to the big players in the Java world; you and I can build such images very easily. The steps for the process have been simplified; for a quick start, see the Adopt OpenJDK Getting Started Kit [12] and Adopt OpenJDK Intermediate & Advance experiences [11] documents. For a detailed version of the same steps, please see the Adopt OpenJDK home page [13]. Basically, building a JDK image from the OpenJDK code-base boils down to the below commands:


(setup steps have been made brief and some commands omitted, see links above for exact steps)

    $ hg clone http://hg.openjdk.java.net/jdk8/jdk8 jdk8    (a)...OpenJDK8
      or
    $ hg clone http://hg.openjdk.java.net/jdk9/jdk9 jdk9    (a)...OpenJDK9
    $ ./get_source.sh                                       (b)
    $ bash configure                                        (c)
    $ make clean images                                     (d)


To explain what is happening at each of the steps above:
(a) We clone the OpenJDK mercurial repo, just like we would with git clone.
(b) Once step (a) is completed, we change into the folder created and run the get_source.sh command, which is equivalent to a git fetch or a git pull, since step (a) only brings down the base files and not all of the files and folders.
(c) Here we run a script that checks for and creates the configuration needed for the compile and build process.
(d) Once step (c) succeeds, we perform a complete compile and build, and create JDK and JRE images from the built artefacts.


As you can see, these are dead-easy steps to follow to build an artefact or JDK/JRE images [step (a) needs to be run only once].
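For convenience, the steps above can be strung together into a small shell script. This is a sketch only: the repository URL and folder names follow the commands shown above, while the DRY_RUN guard is our own addition so the commands can be previewed without Mercurial or the build toolchain installed.

```shell
#!/bin/sh
# Sketch of steps (a)-(d) above; set DRY_RUN=1 to preview the commands.

build_openjdk() {
  repo="${1:-http://hg.openjdk.java.net/jdk8/jdk8}"
  dest="${2:-jdk8}"

  # In dry-run mode just print each command instead of executing it
  run() {
    if [ "${DRY_RUN:-0}" = "1" ]; then echo "WOULD RUN: $*"; else "$@"; fi
  }

  run hg clone "$repo" "$dest"   # (a) clone the base mercurial repo
  [ "${DRY_RUN:-0}" = "1" ] || cd "$dest"
  run ./get_source.sh            # (b) fetch the remaining sub-repositories
  run bash configure             # (c) check and create the build configuration
  run make clean images          # (d) full build, creating JDK and JRE images
}

# Preview only; a real run needs mercurial and the OpenJDK build dependencies.
DRY_RUN=1 build_openjdk
```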


Benefits
– contribute to the evolution and improvement of the Java language & platform
– learn about the internals of the language and platform
– learn about the OS platform and other technologies whilst doing the above
– get involved in F/OSS projects
– stay on top of the latest changes in the Java / JVM sphere
– gain knowledge and experience that helps professionally but is not readily available from other sources (i.e. books, training, work-experience, university courses, etcetera)
– advancement in career
– personal development (soft skills and networking)


Contribute
Join the Adopt OpenJDK [13] and Betterrev [15] projects and contribute by giving us feedback about everything Java, including these projects. Join the Adoption Discuss mailing list [14] and other OpenJDK-related mailing lists to start with; these will keep you updated with the latest progress and changes to OpenJDK. Fork any of the projects you see and submit changes via pull requests.


Thanks and support

Adopt OpenJDK [13] and its umbrella projects have been supported and progressed with the help of the JCP [21], the OpenJDK team [22], JUGs like the London Java Community [16], SouJava [17] and other JUGs in Brazil, and a number of JUGs in Europe, i.e. BGJUG (Bulgarian JUG) [18], BeJUG (Belgian JUG) [19], Macedonian JUG [20], and a number of other smaller JUGs. We hope that more JUGs and individuals will get involved in time. If you or your JUG wish to participate, please get in touch.

—-

Credits
Special thanks to +Martijn Verburg (incepted Adopt OpenJDK), +Richard Warburton, +Oleg Shelajev, +Mite Mitreski, +Kaushik Chaubal and +Julius G for helping improve the content and quality of this post, and for sharing their OpenJDK experience with us.

—-

How to get started?
Join the Adoption Discuss mailing list [14], go to the Adopt OpenJDK home page [13] to get started, followed by referring to the Adopt OpenJDK Getting Started Kit [12] and Adopt OpenJDK Intermediate & Advance experiences [11] documents.


Please share your comments here or tweet at @theNeomatrix369.

Resources
[17] SouJava

This post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on!

How to build OpenJDK projects in Eclipse (for Ubuntu 12.04 LTS)

The instructions below specify how you can build the following OpenJDK projects using the Eclipse IDE for both the new InfraBuild and the Old Build systems:

Swing
JMX
JConsole
Langtools
JAXP and JAXWS
Hotspot
CORBA
JDK — sub-projects – Old Build
and
JDK — sub-projects – Infrabuild

Below is a list of systems and versions under which the installations and configurations were performed to come up with these instructions:

Synaptic Package Manager 0.75.9

Eclipse Indigo 3.7.2
—Eclipse JDT (Java) 3.7.2
—Eclipse CDT (C/C++)
——6.0.0.dist (C/C++ Remote debug launcher)
——1.0.2.dist (CDT GCC Cross Compiler)
——7.0.0.dist (GDB Common)

Ubuntu 12.04 LTS
Ant 1.8.2
OpenJDK 8
Java/Javac 1.7

In order to start, you have to create a VM (or install in your native environment), get the OpenJDK source code, install a number of packages, and build OpenJDK from the command line, as described in Adopt OpenJDK VM Build (Old Build and InfraBuild).


Installing Eclipse for Java via the Synaptic Package Manager

Install Synaptic Package Manager using Ubuntu Software Centre

Install Eclipse packages via the Synaptic Package Manager by searching for “Eclipse-jdt” (without quotes).
Select eclipse-jdt from the packages list and the rest of the dependencies will also get selected.

eclipse-jdt

Eclipse Java Development Tool (eclipse-jdt)

Once that is done, click on Apply.
A Summary window will appear, click on the Apply button again.
Then wait till the Downloading Package Files is finished.
Run Eclipse via the Dash Home

DO NOT USE THE “Check for Updates” option from the Help menu in Eclipse; this could lead to problems with the Java perspective or render the Eclipse platform unusable (error message). Although you can use the “Install New Software” option from the Help menu, it’s best not to.

PLEASE USE THE “Synaptic Package Manager” to do any “updates / upgrades” to Eclipse packages and plugins.

You can use the below commands (optional) to update and upgrade packages:

sudo apt-get update
(password requested)

sudo apt-get upgrade

Also accept any available updates from the Ubuntu Software Centre. Once this is done, a reboot of the Ubuntu system will be required.

Run Eclipse

Check if the Java perspective is available (see under File > New > New Project menu options).

JRE Runtime Environment
To add the run-time environment into Eclipse do the following:
a) Go to Windows > Preferences > Java
b) Select Installed JRE
c) Add a JRE by clicking on the Add button
d) Browse to the /usr/lib/jvm/java-7-openjdk-amd64 folder and select it.

Select Installed JRE

Select Installed JRE

e) Click on Finish to add the new JRE to Eclipse’s list of JREs.

InstalledJRE

InstalledJRE for all build.xml scripts

Shutdown Eclipse once done.


Installing Eclipse C/C++ plug-in (Eclipse CDT) via the Synaptic Package Manager

Run the Synaptic Package Manager
Search for “eclipse-cdt” (without quotes) and click on Search

eclipse-cdt

Eclipse C/C++ Development Tool (eclipse-cdt)

A list of packages appears; select ‘eclipse-cdt’ from the list. Then a list of dependencies appears; select them all, then select Mark for Installation.
Click on the Apply button on the ribbon panel followed by a click on the Apply button at the bottom right hand corner.

If the Synaptic Package Manager detects updates/upgrades and wishes to perform them, accept and allow the changes to be applied.

Click on Next and finally Finished.

Post Eclipse installation, Eclipse needs to be restarted. Post installation of updates/upgrades, Ubuntu needs to be restarted.

Before building any packages through the command line or Eclipse, run the below command to install the ‘g++-4.6-multilib’ library:

sudo apt-get install g++-4.6-multilib

It ensures that command-line and Eclipse build actions go smoothly wherever the g++ library is needed.

Preparing projects, folders and files

Go into the ~/sources/jdk8_tl/jdk/make/ folder, copy the netbeans folder and call it eclipse. Similarly, go into the ~/sources/jdk8_tl/langtools/make/ folder, copy the netbeans folder and call it eclipse.

Then go into the individual folders and rename the nbproject folder to ecproject wherever applicable, and remove the contents of this folder.
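The copy-and-rename step above can be scripted. The sketch below assumes paths without spaces and simply replaces each nbproject folder with an empty ecproject folder (equivalent to renaming it and removing its contents):

```shell
#!/bin/sh
# Copy the netbeans folder to eclipse and swap nbproject for an empty ecproject.
prepare_eclipse_folder() {
  # $1: a make/ folder containing a 'netbeans' sub-folder
  cp -r "$1/netbeans" "$1/eclipse"
  for d in $(find "$1/eclipse" -depth -type d -name nbproject); do
    rm -rf "$d"                          # drop the NetBeans project metadata
    mkdir -p "$(dirname "$d")/ecproject" # leave an empty ecproject in its place
  done
}

# usage, per the folders mentioned above:
# prepare_eclipse_folder ~/sources/jdk8_tl/jdk/make
# prepare_eclipse_folder ~/sources/jdk8_tl/langtools/make
```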

Open ~/sources/jdk8_tl/jdk/make/eclipse/common/shared.xml in a text editor, go to lines 130 and 132, and replace 1.5 and 1.6 with ${javac.version.source} and ${javac.version.target} respectively. These will eventually be overridden by the individual build.properties files for each of the packages, as shown below.
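Instead of a text editor, the substitution can be done with sed. A hedged sketch (GNU sed assumed for -i; it matches the source="1.5" and target="1.6" attributes rather than the fixed line numbers, which is an assumption about how those lines look):

```shell
#!/bin/sh
# Replace hard-coded javac levels in shared.xml with Ant property placeholders.
patch_shared_xml() {
  # $1: path to shared.xml (GNU sed assumed for -i)
  sed -i \
    -e 's/source="1\.5"/source="${javac.version.source}"/' \
    -e 's/target="1\.6"/target="${javac.version.target}"/' \
    "$1"
}

# usage:
# patch_shared_xml ~/sources/jdk8_tl/jdk/make/eclipse/common/shared.xml
```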

Iterate through the below folders looking for the build.xml and build.properties files; if either of them does not exist, create the file:

~/sources/jdk8_tl/jaxp/
~/sources/jdk8_tl/jaxws/
~/sources/jdk8_tl/jdk/make/eclipse/j2se/
~/sources/jdk8_tl/jdk/make/eclipse/jconsole/
~/sources/jdk8_tl/jdk/make/eclipse/jmx/
~/sources/jdk8_tl/jdk/make/eclipse/swing/
~/sources/jdk8_tl/langtools/make/eclipse/langtools/

Ensure the build.xml scripts have Ant tasks in each of them.

Otherwise create them, taking the build script at ~/sources/jdk8_tl/jdk/make/eclipse/swing/build.xml as an example. For one of the packages the Ant task does exist but is surrounded by comments and requires uncommenting.

Look for the ‘javac’ tag in this file, a typical Ant task looks like the below:

    <target name="run" depends="-init">
        <property name="jvm.args" value=""/>
        <javac srcdir="${swing.demo.src}" destdir="${swing.demo.classes}"
            fork="true" failonerror="true"
            classpath="${dist.dir}/lib/swing.jar:${classes.dir}"
            debug="${javac.debug}" debuglevel="${javac.debuglevel}">
            <compilerarg line="${javac.options}"/>
        </javac>
        <java classname="SampleTree" classpath="${swing.demo.classes}"
            fork="true" failonerror="true"
            jvm="${bootstrap.jdk}/bin/java">
            <jvmarg line="${demo.bootclasspath}"/>
            <arg line="${jvm.args}"/>
        </java>
    </target>

Search and replace all references to the texts “netbeans” and “nbproject” in the above files, to “eclipse” and “ecproject” respectively (without the quotes).
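This search-and-replace can likewise be scripted (GNU sed assumed); replacing "netbeans" before "nbproject" is safe because neither string contains the other:

```shell
#!/bin/sh
# Rewrite netbeans/nbproject references to eclipse/ecproject in a given file.
rename_refs() {
  # $1: file to rewrite in place
  sed -i -e 's/netbeans/eclipse/g' -e 's/nbproject/ecproject/g' "$1"
}

# usage:
# rename_refs ~/sources/jdk8_tl/jdk/make/eclipse/swing/build.xml
```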

Amend the build.xml scripts to include source files, output and distribution folders (optional).

Also insert the following line at the top of the build.xml file just under the tag for all the above build.xml files:

  <property name="build.properties" location="./build.properties"/>
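Inserting that line into each build.xml can also be automated. The sketch below is an assumption-laden one-liner (GNU sed; it assumes the opening <project …> tag sits on a single line):

```shell
#!/bin/sh
# Insert the build.properties <property> line right after the <project> tag.
add_build_properties_ref() {
  # $1: build.xml to amend (GNU sed assumed; \n in the replacement is a newline)
  sed -i 's#<project \(.*\)>#&\n  <property name="build.properties" location="./build.properties"/>#' "$1"
}
```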

After importing the above packages, change the JRE version of the build.xml for each of the packages from 1.6 to 1.7 by using the right-mouse-click,

Ant-EditConfiguration (1.6 to 1.7)

Change JRE from 1.6 to 1.7 for all build scripts.

select the menu option Run As > Ant Build…, select the JRE tab, select java-7-openjdk-amd64 from the list next to Separate JRE.

Ant-SelectingAntBuild

Starting the build process by running the Ant Build script.

From within all the build.xml files, remove references to the -demo-init, shared-clean, etc. targets defined with the depends attribute in the tags. Alternatively, create dummy ones in each of the scripts where required in order to validate them successfully.

Amend the Ant task attached to the build.xml by opening the Ant properties page, checking the ‘clean’ task, and reordering the build and clean tasks in the following order, using the Order button:

clean or shared-clean
build

Now individually import these projects into Eclipse (after ensuring both Java and C/C++ projects can be loaded into Eclipse) after making amendments to the build.properties file for the relevant projects.

NewProject (Import project from Ant Build script)

NewProject (Import project from Ant Build script)

NewJavaProject (next screen)

NewJavaProject (next screen)

SelectBuildXML

Select the build.xml file to import the project.

a) Swing
Before importing any project make the following amendments

Insert the below lines to the ~/sources/jdk8_tl/jdk/make/eclipse/swing/build.properties file:
Old_Build

#PATH to JDK8
bootstrap.jdk=/home/openjdk/sources/jdk8_tl/build/linux-amd64_backup/j2sdk-image
basedir=~/sources/jdk8_tl/jdk/make/eclipse/swing/
javac.version.source=1.5
javac.version.target=1.7

InfraBuild

#PATH to JDK8
bootstrap.jdk=/home/openjdk/sources/jdk8_tl/build/linux-x64-normal-server-release/images/j2sdk-image
basedir=~/sources/jdk8_tl/jdk/make/eclipse/swing/
javac.version.source=1.5
javac.version.target=1.7

If the file already exists, check whether its contents include the above lines; if not, add these lines to the bottom of the build.properties file.
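A small idempotent helper can append any of these key=value lines only when missing; this is our own convenience sketch, not part of the official instructions:

```shell
#!/bin/sh
# Append key=value to a properties file only if the key is not already present.
ensure_prop() {
  # $1: properties file, $2: key, $3: value
  grep -q "^$2=" "$1" 2>/dev/null || echo "$2=$3" >> "$1"
}

# usage, with the Old Build values shown above:
# ensure_prop build.properties bootstrap.jdk /home/openjdk/sources/jdk8_tl/build/linux-amd64_backup/j2sdk-image
# ensure_prop build.properties javac.version.source 1.5
# ensure_prop build.properties javac.version.target 1.7
```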

Run Eclipse and do the below:

Import a project using the Import a project from existing Ant BuildFile option from New > Project > Other, click on Browse, go to the ~/sources/jdk8_tl/jdk/make/eclipse/swing folder and select build.xml.

Once the build file is loaded, select the ‘Link to buildfile in the file system’ option. The project gets loaded as expected and gets the name defined in the script.

Open the build.xml properties by selecting the file, and selecting the Run As > Build Ant… option from the menu. Set the Target to “shared-clean, build” by selecting shared-clean from the list of targets and changing the order using the Order button.

Run the Build of Swing project by selecting the build.xml file and right mouse click on the Run as > Ant build option.

Observe the messages in the Console tab in Eclipse and wait for some time until the build is complete (this depends on the performance of your computer).

See How to build the Swing project for a more detailed approach.

b) JMX
Perform the same step as for Swing using the respective files in the ~/sources/jdk8_tl/jdk/make/eclipse/jmx/ folder. Also amend the build.properties file in the same manner.

Run the Build of JMX project by selecting the build.xml file and right mouse click on the Run as > Ant build option.

Observe the messages in the Console tab in Eclipse and wait for some time until the build is complete (this depends on the performance of your computer).

See How to build the JMX project for a more detailed approach.

c) JConsole
Perform the same step as for Swing using the respective files in the ~/sources/jdk8_tl/jdk/make/eclipse/jconsole/ folder.

Amend the build.properties file slightly differently here; insert the below lines instead:

Old Build

#PATH to JDK8
bootstrap.jdk=/home/openjdk/sources/jdk8_tl/build/linux-amd64_backup/j2sdk-image
basedir=~/sources/jdk8_tl/jdk/make/eclipse/jconsole/
javac.version.source=1.7
javac.version.target=1.7

InfraBuild

#PATH to JDK8
bootstrap.jdk=/home/openjdk/sources/jdk8_tl/build/linux-x64-normal-server-release/images/j2sdk-image
basedir=~/sources/jdk8_tl/jdk/make/eclipse/jconsole/
javac.version.source=1.7
javac.version.target=1.7

Run the Build of JConsole project by selecting the build.xml file and right mouse click on the Run as > Ant build option.

Observe the messages in the Console tab in Eclipse and wait for some time until the build is complete (this depends on the performance of your computer).

See How to build the JConsole project for a more detailed approach.

d) Langtools

Edit the build.properties for langtools located under ~/sources/jdk8_tl/langtools, and amend the following bits in the boot.java.home section:

boot.java.home = /usr/lib/jvm/java-7-openjdk-amd64
boot.javac.source = 7
boot.javac.target = 7

Copy the build.properties file from the ~/sources/jdk8_tl/langtools/make/ folder to the ~/sources/jdk8_tl/langtools/make/eclipse/langtools folder.

In addition for langtools, change the following two tags at the top of the file from

<project name="langtools-netbeans" default="build" basedir="../../..">
       <property name="langtools.properties" location="make/netbeans/langtools/nbproject/private/langtools.properties"/>

to

<project name="langtools-eclipse" default="build" basedir="../../..">
       <property name="langtools.properties" location="make/eclipse/langtools/ecproject/private/langtools.properties"/>

Ensure the build.properties in the /home/openjdk/sources/jdk8_tl/langtools/make/ folder has the below line in it otherwise add/update it:

boot.java.home = /usr/lib/jvm/java-7-openjdk-amd64

Build the project via the right-mouse click Run As > Ant Build option and wait for it to finish.

See How to build the Langtools project for a more detailed approach.

e) JAXP and JAXWS

For both these packages, edit the build.properties for each of them located under ~/sources/jdk8_tl/, add the below line to it:

bootstrap.dir=/usr/lib/jvm/java-7-openjdk-amd64

Perform the same tasks as for the Swing package with these two packages and watch for it to build.

Build the project via the right-mouse click Run As > Ant Build option and wait for it to finish.

See How to build the JAXP and JAXWS projects for a more detailed approach.

f) Hotspot

Building Hotspot via the command-line interface (Old build)

Perform the below instructions to build Hotspot using the old build system (ensure the environment variables are set, see the sub-heading Old Build, and that the ~/.bashrc file has been source-ed before proceeding):

source ~/.bashrc

cd ~/sources/jdk8_tl/hotspot/make
make clean
make all
-or-
make clean all

(the first make just cleans, the next one does a full or incremental build, whilst the last one always does a clean full build)

A successful build will result in the following build messages that can be found in the log file (https://github.com/neomatrix369/BuildHelpers/blob/master/EclipseProjectsForOpenJDK/Logs/hotspot/hotspotBuild.log). An incremental (another attempt) build results in the following log messages (https://github.com/neomatrix369/BuildHelpers/blob/master/EclipseProjectsForOpenJDK/Logs/hotspot/hotspotBuild2ndPass.log).

Building Hotspot via the command-line interface (InfraBuild)

Configuration
We do not need to set the ALT_BOOTDIR or ALT_JDK_IMPORT_PATH environment variables; instead use the --with-boot-jdk=/usr/lib/jvm/java-7-openjdk-amd64 parameter when calling the configure bash script, e.g.

bash ../autoconf/configure --with-boot-jdk=/usr/lib/jvm/java-7-openjdk-amd64/

Run the above command from within the ~/sources/jdk8_tl/common/makefiles folder. The JDK or JRE path specified with the --with-boot-jdk parameter may vary and is specific to your Linux environment.

A successful run results in the output of the following log messages (https://github.com/neomatrix369/BuildHelpers/blob/master/EclipseProjectsForOpenJDK/Logs/configureInfraBuild.log).

Building
Perform the below instructions to build Hotspot using the InfraBuild system (ensure the necessary environment variables are set and the ALT_… environment variables are unset, see the sub-heading InfraBuild, and that the ~/.bashrc file has been source-ed before proceeding):

source ~/.bashrc
make clean hotspot NEW_BUILD=true &> hotspotInfraBuild.log

A successful run results in the output of the following log messages (https://github.com/neomatrix369/BuildHelpers/blob/master/EclipseProjectsForOpenJDK/Logs/hotspot/hotspotInfraBuild.log). An incremental (another attempt) build results in the following log messages (https://github.com/neomatrix369/BuildHelpers/blob/master/EclipseProjectsForOpenJDK/Logs/hotspot/hotspotInfraBuild2ndPass.log).

Perform the configuration settings in Eclipse for Hotspot using the instructions at Eclipse C/C++ (CDT) – Hacking Hotspot in Eclipse and the Adding include paths to your C/C++ project instructions sets.

Ensure the below environment variables are added to the project:
Old Build

LANG=C
ALT_BOOTDIR=/usr/lib/jvm/java-7-openjdk-amd64
ALT_JDK_IMPORT_PATH=/usr/lib/jvm/java-7-openjdk-amd64
PRODUCT_HOME=/home/openjdk/sources/jdk8_tl/build/linux-amd64_backup/j2sdk-image
ZIP_DEBUGINFO_FILES=0

InfraBuild

LANG=C
PRODUCT_HOME=/home/openjdk/sources/jdk8_tl/build/linux-x64-normal-server-release/images/j2sdk-image
ZIP_DEBUGINFO_FILES=0
ARCH_DATA_MODEL=64

Special steps for creating the C/C++ projects
Add the ‘make’ folder to the Source folder list by going to Project Properties > C/C++ General / Paths and Symbols > Source Location.
Click on Add Folder and select the /hotspot/make folder.

Then add an external builder by going to Project Properties > Builders
Click on New, choose Program as configuration type, click on OK and
call the builder ‘make_hotspot’.
Link it to /usr/bin/make by setting as the Location
Make the Working directory point to the make or makefiles folder – use the Browse Filesystem option to locate it.

If you’d like to do incremental builds, add “all” to the arguments list; for a full build, add “clean” and “all”. This determines the build action each time the project is built:
clean
all

Finally, add the above-listed environment variables for Hotspot to the External Builder configuration under the Environment tab, as shown in the screen-shot (which shows two of the remaining expected environment variables):

External Builder Environment Settings

External Builder Environment Settings

Now under Project Properties > C/C++ Build,
Under the Builder Settings tab:
Select External Builder as the Builder type (see at the top of the tab)
Keep the Generate Makefiles automatically option checked
Click on Apply.

Now go to the Behaviour tab and uncheck the [Build on resource save…] and [Build (incremental build)] options to ensure that the External Builder configuration is used when the build action is invoked on a project.

Click on OK.

Build the project via the Project > Build Project option and wait for it to finish. At this point the project should be in a state to be built by either the Internal or the External Builder.

See How to build the Hotspot project for a more detailed approach.

g) Corba

Do the same actions as above for Corba as for the Hotspot project and translate names and locations that relate to the Corba project.

Build the project via the Project > Build Project option and wait for it to finish. At this point the project should be in a state to be built by either the Internal or the External Builder.

See How to build the Corba project for a more detailed approach.

h) JDK — sub-projects – Old Build

This build depends on successful builds of the Hotspot, JAXP, JAXWS, Langtools and Corba packages.

Add the below lines to the build.properties file in case they do not already exist:

make.options=\
ALT_BOOTDIR=/usr/lib/jvm/java-7-openjdk-amd64 \
ALT_LANGTOOLS_DIST=~/sources/jdk8_tl/langtools/dist \
ALT_CORBA_DIST=~/sources/jdk8_tl/corba/dist \
ALT_JAXP_DIST=~/sources/jdk8_tl/jaxp/dist \
ALT_JAXWS_DIST=~/sources/jdk8_tl/jaxws/dist \
HOTSPOT_IMPORT_PATH=~/sources/jdk8_tl/hotspot/build/linux-amd64_backup/hotspot/import \
ALT_HOTSPOT_IMPORT_PATH=~/sources/jdk8_tl/hotspot/build/linux-amd64_backup/hotspot/import \
HOTSPOT_SERVER_PATH=~/sources/jdk8_tl/build/linux-amd64_backup/hotspot/import/jre/lib/amd64/server \
ALT_HOTSPOT_SERVER_PATH=~/sources/jdk8_tl/build/linux-amd64_backup/hotspot/import/jre/lib/amd64/server \
OPENJDK=true

Import a project using the Import a project from existing Ant BuildFile option via the New > Project > Other menu option and select ~/sources/jdk8_tl/jdk/make/eclipse/j2se/build.xml file for the j2se project.

Alternatively the whole project can be loaded as a C/C++ project following the same instructions as for the Hotspot project above. The above environment variables settings will be required to be added to the project.

Build the project via the Project > Build Project option and wait for it to finish.

Once all of these packages are built successfully, close down Eclipse.

For sanity check purposes please ensure that the individual packages / projects build from the command prompt and the OpenJDK project as a whole builds from the command prompt i.e.

cd ~/sources/jdk8_tl/[...project..]/make/    or   cd ~/sources/jdk8_tl/jdk/make/[...project..]/
make all

This is to ensure one process hasn’t negatively impacted the other. For a list of project names, refer to the section at the top of the page.
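The sanity check above can be looped over the project folders; a sketch (the project list is an assumption drawn from the sections above, and folders that do not exist are skipped):

```shell
#!/bin/sh
# Run 'make all' in each project's make folder under the given source root.
sanity_check_builds() {
  # $1: source root, e.g. ~/sources/jdk8_tl
  for p in langtools corba jaxp jaxws hotspot; do
    d="$1/$p/make"
    if [ -d "$d" ]; then
      ( cd "$d" && make all ) || echo "BUILD FAILED: $p"
    else
      echo "skipping $p (no $d)"
    fi
  done
}

# usage:
# sanity_check_builds ~/sources/jdk8_tl
```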

See How to build the JDK sub-projects – Old build for a more detailed approach.

i) JDK — sub-projects – Infrabuild

Do the same actions as above for the Hotspot project and translate names and locations that relate to the OpenJDK project. Import the project from the /home/openjdk/sources/jdk8_tl/common/makefiles folder and rename the project to something like ‘OpenJDK’ instead of the default ‘makefiles’ name. Depending on the build system used, apply the below environment variable and argument list settings (the traditional ALT_ variables are no longer required; instead they should all be unset).

LANG=C
PRODUCT_HOME=/home/openjdk/sources/jdk8_tl/build/linux-x64-normal-server-release/images/j2sdk-image
ZIP_DEBUGINFO_FILES=0

When setting up the External Builder setting for the Infrabuild system, you can apply one of the below entries in the arguments list:

images
clean images
images VERBOSE="-d -p"
clean images VERBOSE="-d -p"

Note: Please refer to http://openjdk.java.net/projects/build-infra/guide.html for further details on what the arguments mean in the Infrabuild system.

The rest of the instructions to setup the Hotspot project apply. Build the project via the Project > Build Project option and wait for it to finish. The above results in creation of the j2re and j2sdk images in the /jdk8_tl/build/linux-x64-normal-server-release/images folder (in folders j2re-image and j2sdk-image respectively).

Note: you can jump-start the process by using the ready-made projects available in the GitHub repo mentioned below. A clean build using the InfraBuild system takes about an hour on a VM and under 15 minutes on a native machine, all subject to various performance factors of a system. The Old Build system is slower in this respect.

See How to build the JDK sub-projects – Infrabuild for a more detailed approach.


Full OpenJDK build Logs

See Full OpenJDK build Logs to view the logs produced from a full build action.


Full OpenJDK build output

See Full build output: artefacts for details on build outputs (artefacts) produced from a full build action.


Newly created Eclipse projects

Now you have a number of ready created Eclipse projects in your Eclipse workspace.

Go to the workspace folder where Eclipse creates the project files and settings i.e. into the /home/openjdk/workspace folder.

There are a number of folders, each representing a project you have been working on; in each folder there are at least two to four hidden files and folders such as:

.project
.cproject
.classpath
.settings

Copy these over to the ecproject folder of each respective package, as shown below:

~/sources/jdk8_tl/jaxp/ecproject/
~/sources/jdk8_tl/jaxws/ecproject/
~/sources/jdk8_tl/jdk/make/eclipse/j2se/ecproject/
~/sources/jdk8_tl/jdk/make/eclipse/jconsole/ecproject/
~/sources/jdk8_tl/jdk/make/eclipse/jmx/ecproject/
~/sources/jdk8_tl/jdk/make/eclipse/swing/ecproject/
~/sources/jdk8_tl/langtools/make/eclipse/langtools/ecproject/
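Copying those hidden files over can be scripted per project; a sketch (the folder pairs follow the list above, and files that are absent for a given project are simply skipped):

```shell
#!/bin/sh
# Copy Eclipse project metadata from a workspace folder into an ecproject folder.
copy_project_settings() {
  # $1: workspace project folder, $2: destination ecproject folder
  mkdir -p "$2"
  for f in .project .cproject .classpath .settings; do
    if [ -e "$1/$f" ]; then cp -r "$1/$f" "$2/"; fi
  done
}

# usage:
# copy_project_settings /home/openjdk/workspace/swing ~/sources/jdk8_tl/jdk/make/eclipse/swing/ecproject
```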

An archive of the list of files that were modified across the OpenJDK system (including the .cproject, .project, build.xml, build.properties, etc. files) can be found at:

Github repo with directory structure of files and folders (last updated 18/09/2012) for both the Old Build and the InfraBuild system

In the Infrabuild folder in the repo, two new projects namely openJDK (images) and openJDK (images VERBOSE) have been included to build the whole OpenJDK system via the InfraBuild system.

Here is a list of packages which were not built during the process; they all have the necessary make and build scripts in them:

awt2d
jarzip
jdwpgen
world

Please provide any feedback on areas that work differently for you. If you have fixed an issue also let us know so that we can update the information here.

This blog has been inspired by a couple of JUG members working on the OpenJDK project; below are links leading up to them. Their instructions and systematic approach have helped in the process:

Netbeans projects for OpenJDK – Martijn Verburg
Netbeans projects for OpenJDK – Andrii Rodionov
Hacking Hotspot in Eclipse – Roman Kennke
Adding include paths to your C/C++ project – Roland
Sachin Handiekar – Screen-shots and other improvements.
Girish Balakrishnan – for reviewing and feedback on the Hotspot instructions.
Jonathan Gibbons (from the OpenJDK team) – Thanks for reviewing the Langtools section.

A number of other online resources and videos have also helped in the process.