Examples

This is a collection of example projects that illustrate how to implement certain applications and solutions with signac. Unlike the tutorial, the examples consist mainly of complete, immediately executable source code with less explanation.

Note

A more comprehensive set of examples is available in the signac-examples repository.

Ideal Gas

This example is based on the Tutorial and assumes that we want to model a system using the ideal gas law:

\[p V = N k_B T\]
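Solving for the volume gives \(V = N k_B T / p\), which is the quantity computed throughout this example. A minimal sketch of that relation (the function name is illustrative, not part of signac):

```python
def ideal_gas_volume(N, kT, p):
    """Volume of an ideal gas, from p V = N k_B T solved for V."""
    return N * kT / p

# For N=1000, kT=1.0, p=10 the volume is 100.0.
```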

The data space is initialized for a specific system size \(N\), thermal energy \(kT\), and pressure \(p\) in a script called init.py:

# init.py
import signac

project = signac.init_project('ideal-gas-project')

for p in range(1, 11):
    sp = {'p': p, 'kT': 1.0, 'N': 1000}
    job = project.open_job(sp)
    job.init()

The workflow consists of a compute_volume operation that computes the volume based on the given parameters and stores it in a file called volume.txt within each job's workspace directory. Two additional operations copy the result into a JSON file called data.json and into the job document under the volume key, respectively. All operations are defined in project.py:

# project.py
from flow import FlowProject


@FlowProject.label
def volume_computed(job):
    return job.isfile("volume.txt")


@FlowProject.operation
@FlowProject.post(volume_computed)
def compute_volume(job):
    volume = job.sp.N * job.sp.kT / job.sp.p
    with open(job.fn("volume.txt"), "w") as file:
        file.write(str(volume) + "\n")


@FlowProject.operation
@FlowProject.pre.after(compute_volume)
@FlowProject.post.isfile("data.json")
def store_volume_in_json_file(job):
    import json
    with open(job.fn("volume.txt")) as textfile:
        with open(job.fn("data.json"), "w") as jsonfile:
            data = {"volume": float(textfile.read())}
            jsonfile.write(json.dumps(data) + "\n")


@FlowProject.operation
@FlowProject.pre.after(compute_volume)
@FlowProject.post(lambda job: 'volume' in job.document)
def store_volume_in_document(job):
    with open(job.fn("volume.txt")) as textfile:
        job.document.volume = float(textfile.read())



if __name__ == '__main__':
    FlowProject().main()

The complete workflow can be executed on the command line with $ python project.py run.
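Stripped of the FlowProject machinery, the three operations amount to a simple chain of steps. The following standalone sketch reproduces that chain using a temporary directory in place of a job workspace and a plain dict in place of the job document (names here are illustrative):

```python
import json
import tempfile
from pathlib import Path

# Statepoint parameters, as created by init.py.
sp = {"p": 10, "kT": 1.0, "N": 1000}

workspace = Path(tempfile.mkdtemp())

# compute_volume: apply the ideal gas law and store the result as text.
volume = sp["N"] * sp["kT"] / sp["p"]
(workspace / "volume.txt").write_text(str(volume) + "\n")

# store_volume_in_json_file: copy the result into data.json.
data = {"volume": float((workspace / "volume.txt").read_text())}
(workspace / "data.json").write_text(json.dumps(data) + "\n")

# store_volume_in_document: copy the result into the (mock) job document.
document = {}
document["volume"] = float((workspace / "volume.txt").read_text())
```

In the real project, the pre- and post-conditions shown above ensure that the two copy operations only run after compute_volume has produced volume.txt.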

MD with HOOMD-blue

This example demonstrates how to set up and analyze the simulation of a Lennard-Jones fluid with molecular dynamics (MD) using HOOMD-blue. The project data space is initialized in a src/init.py script with an explicit random seed:

#!/usr/bin/env python
"""Initialize the project's data space.

Iterates over all defined state points and initializes
the associated job workspace directories."""
import logging
import argparse
from hashlib import sha1

import signac
import numpy as np


def main(args, random_seed):
    project = signac.init_project('Lennard-Jones-Fluid-Example-Project')
    for replication_index in range(args.num_replicas):
        for p in np.linspace(0.5, 5.0, 10):
            statepoint = dict(
                    # system size
                    N=512,

                    # Lennard-Jones potential parameters
                    sigma=1.0,
                    epsilon=1.0,
                    r_cut=2.5,

                    # random seed
                    seed=random_seed*(replication_index + 1),

                    # thermal energy
                    kT=1.0,
                    # pressure
                    p=p,
                    # thermostat coupling constant
                    tau=1.0,
                    # barostat coupling constant
                    tauP=1.0)
            project.open_job(statepoint).init()


if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description="Initialize the data space.")
    parser.add_argument(
        'random',
        type=str,
        help="A string to generate a random seed.")
    parser.add_argument(
        '-n', '--num-replicas',
        type=int,
        default=1,
        help="Initialize multiple replications.")
    args = parser.parse_args()

    # Generate an integer from the random str.
    try:
        random_seed = int(args.random)
    except ValueError:
        random_seed = int(sha1(args.random.encode()).hexdigest(), 16) % (10 ** 8)

    logging.basicConfig(level=logging.INFO)
    main(args, random_seed)

Using this script, one replica set (for a given random seed, e.g., 42) can then be initialized with:

$ python src/init.py 42
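The seed derivation used in init.py can be isolated into a small helper: numeric strings pass through unchanged, while any other string is hashed into a reproducible integer below \(10^8\):

```python
from hashlib import sha1

def derive_seed(random_str):
    """Map an arbitrary string to an integer seed, as in init.py."""
    try:
        return int(random_str)
    except ValueError:
        return int(sha1(random_str.encode()).hexdigest(), 16) % (10 ** 8)
```

Because the hash is deterministic, re-running init.py with the same string always reproduces the same seed, which keeps the data space initialization reproducible.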

The simulation and analysis workflow is broken into three operations:

  1. initialize: Initialize the simulation configuration.
  2. estimate: Use the ideal gas law to estimate the expected volume.
  3. sample: Carry out the simulation with HOOMD-blue.

These three operations and their corresponding condition functions are defined and implemented within a src/project.py module:

"""This module contains the operation functions for this project.

The workflow defined in this file can be executed from the command
line with

    $ python src/project.py run [job_id [job_id ...]]

See also: $ python src/project.py --help
"""
from flow import FlowProject


class Project(FlowProject):
    pass


@Project.operation
@Project.post.isfile('init.gsd')
def initialize(job):
    "Initialize the simulation configuration."
    import hoomd
    from math import ceil
    if hoomd.context.exec_conf is None:
        hoomd.context.initialize('')
    with job:
        with hoomd.context.SimulationContext():
            n = int(ceil(pow(job.sp.N, 1.0/3)))
            assert n**3 == job.sp.N
            hoomd.init.create_lattice(unitcell=hoomd.lattice.sc(a=1.0), n=n)
            hoomd.dump.gsd('init.gsd', period=None, group=hoomd.group.all())


@Project.operation
@Project.post(lambda job: 'volume_estimate' in job.document)
def estimate(job):
    "Find volume predicted by ideal gas law as an estimate of the LJ fluid."
    # Since we are not simulating an ideal gas, the actual simulation
    # volume may differ from the volume calculated here.
    V = job.sp.N * job.sp.kT / job.sp.p
    job.document.volume_estimate = V


def current_step(job):
    import gsd.hoomd
    if job.isfile('dump.gsd'):
        with gsd.hoomd.open(job.fn('dump.gsd')) as traj:
            return traj[-1].configuration.step
    return -1


@Project.label
def sampled(job):
    return current_step(job) >= 5000


@Project.label
def progress(job):
    return '{}/4'.format(int(round(current_step(job) / 5000 * 4)))


@Project.operation
@Project.pre.after(initialize)
@Project.post(sampled)
def sample(job):
    "Sample operation."
    import logging
    import hoomd
    from hoomd import md
    if hoomd.context.exec_conf is None:
        hoomd.context.initialize('')
    with job:
        with hoomd.context.SimulationContext():
            hoomd.init.read_gsd('init.gsd', restart='restart.gsd')
            group = hoomd.group.all()
            gsd_restart = hoomd.dump.gsd(
                'restart.gsd', truncate=True, period=100, phase=0, group=group)
            lj = md.pair.lj(r_cut=job.sp.r_cut, nlist=md.nlist.cell())
            lj.pair_coeff.set('A', 'A', epsilon=job.sp.epsilon, sigma=job.sp.sigma)
            md.integrate.mode_standard(dt=0.005)
            md.integrate.npt(
                group=group, kT=job.sp.kT, tau=job.sp.tau,
                P=job.sp.p, tauP=job.sp.tauP)
            hoomd.analyze.log('dump.log', ['volume'], period=100)
            hoomd.dump.gsd('dump.gsd', period=100, group=hoomd.group.all())
            try:
                hoomd.run_upto(5001)
            except hoomd.WalltimeLimitReached:
                logging.warning("Reached walltime limit.")
            finally:
                gsd_restart.write_restart()
                job.document['sample_step'] = hoomd.get_step()


if __name__ == '__main__':
    Project().main()

There are two additional label functions: sampled, which shows whether the simulation has finished, and progress, which shows the rough progress of the run in quarters.
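The quarter mapping used by the progress label can be written as a standalone helper. This is a minimal sketch assuming a target of 5000 steps, as used by sampled; note that the rounding must happen after scaling by 4 so that intermediate quarters appear:

```python
def progress_label(step, target=5000):
    """Map the current time step onto rough quarters of the run."""
    return '{}/4'.format(int(round(step / target * 4)))

# progress_label(2500) yields '2/4', progress_label(5000) yields '4/4'.
```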

Execute the initialization and simulation with:

$ python src/project.py run
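Once the runs finish, the sampled volume can be compared against the ideal-gas value stored under volume_estimate in each job document. A minimal sketch of such a comparison (the function is illustrative; how you extract the measured volume from dump.log is up to you):

```python
def relative_deviation(measured, estimate):
    """Relative deviation of a measured volume from the ideal-gas estimate."""
    return abs(measured - estimate) / estimate

# A measured volume of 90.0 against an estimate of 100.0 deviates by 0.1.
```

Since the Lennard-Jones fluid is not an ideal gas, a nonzero deviation is expected, particularly at higher pressures.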

Integration with Sacred

Integrating a Sacred experiment with signac-flow is straightforward. Assume the following Sacred experiment is defined in an experiment.py module:

from sacred import Experiment

ex = Experiment()


@ex.main
def hello(foo):
    print('hello', foo)


if __name__ == '__main__':
    ex.run_commandline()

Then we can integrate that experiment on a per-job basis like this:

from flow import FlowProject
from sacred.observers import FileStorageObserver

from experiment import ex


class SacredProject(FlowProject):
    pass


@SacredProject.operation
def run_experiment(job):
    ex.add_config(**job.sp())
    ex.observers[:] = [FileStorageObserver.create(job.fn('my_runs'))]
    ex.run()


if __name__ == '__main__':
    SacredProject().main()