Dask and Jupyter

Parallel Python with Dask and Jupyter

The Dask framework provides an incredibly useful environment for parallel execution of Python code in interactive settings (e.g. Jupyter) or in batch mode. Its key features are (from what I have seen so far):

- representation of threading, multiprocessing, and distributed computing with one unified API and CLI,
- abstraction of HPC schedulers (PBS, Moab, SLURM, …),
- data structures for distributed computing with pandas and NumPy syntax.

Dask-jobqueue

The package dask_jobqueue seems to me to be the most user-friendly one when it comes to parallelization on HPC clusters with a scheduling system such as SLURM. [Read More]
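As a quick illustration, here is a minimal sketch of how dask_jobqueue can be pointed at a SLURM cluster; the queue name, resource sizes, and worker counts are placeholders that need to be adapted to the cluster at hand:

```python
from dask.distributed import Client
from dask_jobqueue import SLURMCluster
import dask.array as da

# Describe what a single SLURM job (one dask worker) should look like.
# Queue name and resource sizes below are placeholders for illustration.
cluster = SLURMCluster(
    queue="normal",        # SLURM partition to submit to
    cores=8,               # cores per job
    memory="16GB",         # memory per job
    walltime="01:00:00",   # walltime per job
)

# Request 4 such jobs from SLURM; they connect back as dask workers.
cluster.scale(jobs=4)

# Attach a client so that subsequent dask computations run on the cluster.
client = Client(cluster)

# Example: a distributed array computation with numpy-like syntax.
x = da.random.random((10000, 10000), chunks=(1000, 1000))
print(x.mean().compute())
```

The nice part of this setup is that the same code runs unchanged from a Jupyter notebook or a batch script; only the cluster object decides where the workers actually live.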

Running matlab code on HPC with SLURM

Running MATLAB scripts on HPC

Today, the question came up of how to run MATLAB code on an HPC system featuring a SLURM scheduler. The syntax for running MATLAB on the command line is indeed a bit counterintuitive, at least if you are (like me) used to running Python or R scripts.

Example SLURM script

The following snippet is an example of how to submit a MATLAB script for execution on an HPC server with the SLURM scheduler: [Read More]
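The actual snippet is in the full post; as a rough sketch, a submission script along these lines could look as follows, assuming a MATLAB module is available on the cluster (job name, resources, module name, and my_script.m are placeholders):

```bash
#!/bin/bash
#SBATCH --job-name=matlab-test   # placeholder job name
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1
#SBATCH --time=01:00:00
#SBATCH --mem=4G

# Load the MATLAB module (name and version depend on the cluster).
module load matlab

# Run my_script.m without the desktop GUI and exit MATLAB afterwards.
matlab -nodisplay -nosplash -r "run('my_script.m'); exit"
```

On recent MATLAB releases, the `-batch "my_script"` option offers a more script-friendly alternative to the `-r "...; exit"` idiom.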