# Deploying Spack Environments on HPCs
Once you have updated Spack dependencies and bumped the Polaris version, you must deploy and test the new environments on supported HPC systems. This ensures compatibility with system modules and successful builds of E3SM components.
This page outlines the deployment workflow, key files, command-line flags, and best practices for deploying and validating Spack environments in Polaris.
## Deployment Workflow
Deployment is managed via `./deploy.py` (backed by `mache.deploy`) and
associated infrastructure. The process is typically:
1. **Update configuration files:**
    - Set the target version in `polaris/version.py`
    - Update package pins in `deploy/pins.cfg`
    - Update Spack specs in `deploy/spack.yaml.j2` if package specs changed
    - Update machine configs in `polaris/machines/` as needed

2. **Test the build on one or more HPC machines:**

    ```shell
    SCRATCH=<path_to_scratch>
    mkdir -p $SCRATCH/tmp_spack
    ./deploy.py --deploy-spack --spack-path $SCRATCH/test_spack \
        --compiler intel intel gnu --mpi openmpi impi openmpi --recreate
    ```

    Adjust `--compiler` and `--mpi` as needed for your machine and test
    matrix. You may want to use `screen` or `tmux` and pipe output to a
    log file:

    ```shell
    ./deploy.py ... 2>&1 | tee deploy.log
    ```

3. **Check output and validate:**
    - Spack built the expected packages
    - The pixi environment was created and activated
    - Activation scripts were generated and symlinked correctly
    - Permissions were updated successfully

4. **Test E3SM component builds and workflows using the new environment.**

5. **Deploy more broadly once core systems pass testing.**
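The `--compiler` and `--mpi` flags in step 2 take parallel lists. A minimal sketch of how such lists pair into build combinations, assuming positional pairing (as the repeated `intel` in the example command suggests; see `./deploy.py --help` or the mache deploy docs for the authoritative semantics):

```python
# Sketch: parallel --compiler/--mpi lists pairing positionally into
# one Spack environment per (compiler, mpi) combination.
compilers = ["intel", "intel", "gnu"]
mpis = ["openmpi", "impi", "openmpi"]

# The two lists must be the same length for positional pairing to work.
if len(compilers) != len(mpis):
    raise ValueError("--compiler and --mpi lists must be the same length")

combos = list(zip(compilers, mpis))
print(combos)
# -> [('intel', 'openmpi'), ('intel', 'impi'), ('gnu', 'openmpi')]
```

So the example command above requests three environments: intel+openmpi, intel+impi, and gnu+openmpi.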
## Key Deployment Components
- `deploy.py`: Main entry point for deploying Polaris environments. Handles pixi deployment and optional Spack deployment through `mache.deploy`.
- `deploy/pins.cfg`: Pinned versions for pixi and Spack packages.
- `deploy/config.yaml.j2`: Deployment behavior and machine/runtime settings consumed by `mache.deploy`.
- `deploy/spack.yaml.j2`: Jinja2 template for Spack specs.
- `deploy/hooks.py`: Polaris-specific deployment hooks used by `mache.deploy`.
- [Mache deploy docs](https://docs.e3sm.org/mache/main/users_guide/deploy.html): authoritative behavior and option details.
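To illustrate how these pieces fit together, here is a hypothetical fragment of a Spack environment template in the style of `deploy/spack.yaml.j2`. The specs and Jinja2 variable names below are placeholders, not the actual Polaris template; pinned versions are rendered in from `deploy/pins.cfg` at deploy time:

```yaml
# Hypothetical sketch only -- the real template and variable names differ
spack:
  specs:
    # versions are filled in from pinned values when the template is rendered
    - 'hdf5@{{ hdf5_version }}+hl+mpi'
    - 'netcdf-c@{{ netcdf_c_version }}+mpi'
  concretizer:
    unify: true
```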
## Common Command-Line Flags
- `--deploy-spack`: Build or rebuild Spack environments.
- `--spack-path <path>`: Path to the Spack checkout used for deployment/testing.
- `--compiler <compiler(s)>`: Compiler(s) to build for.
- `--mpi <mpi(s)>`: MPI library/libraries.
- `--recreate`: Recreate environments even if they already exist.
- `--mache-fork` and `--mache-branch`: Use a specific fork/branch of `mache` (for co-development/testing).
See `./deploy.py --help` for the full list.
If needed, set the Spack temporary build location with `spack.tmpdir` in
`deploy/config.yaml.j2`.
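As a sketch (the nested layout is assumed from the dotted `spack.tmpdir` name, and the path is a placeholder; check the mache deploy docs for the actual schema):

```yaml
# Hypothetical fragment of deploy/config.yaml.j2
spack:
  tmpdir: /scratch/$USER/tmp_spack
```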
## Notes and Best Practices
- Use a unique Spack install location for testing (`--spack-path`).
- Use a scratch or group directory for Spack's temporary build files.
- Set `spack.tmpdir` in `deploy/config.yaml.j2` if you need to control the temporary build location.
- Only deploy shared Spack environments after thorough testing.
- Check `albany_supported.txt` and `petsc_supported.txt` for supported machine/compiler/MPI combos.
- For troubleshooting, see Troubleshooting Deployment.
➡ Next: Troubleshooting Deployment