parfor

Parallel for-loops on workers in a parallel pool

Parallel Computing Toolbox™ supports interactive parallel computing and enables you to accelerate your workflow by running on multiple workers in a parallel pool. Use parfor to execute for-loop iterations in parallel on workers in a parallel pool. When you have profiled your code and identified slow for-loops, try parfor to increase your throughput. Develop parfor-loops on your desktop and scale up to a cluster without changing your code.
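For example, a minimal sketch of the conversion: a for-loop whose iterations are independent of each other becomes a parfor-loop by changing the keyword (the matrix size and iteration count below are arbitrary).

    % Serial version: each iteration is independent of the others.
    a = zeros(1,100);
    for i = 1:100
        a(i) = max(abs(eig(rand(300))));
    end

    % Parallel version: the iterations run on workers in the parallel pool.
    a = zeros(1,100);
    parfor i = 1:100
        a(i) = max(abs(eig(rand(300))));
    end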
Decide When to Use parfor
Discover basic concepts of a parfor-loop, and decide when to use it.
Convert for-Loops Into parfor-Loops
Diagnose and fix common parfor problems.
Ensure That parfor-Loop Iterations are Independent
Unlike a for-loop, parfor-loop iterations have no guaranteed order.
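As an illustrative sketch of that requirement, each iteration below reads only its own loop index and writes only its own element of the output, so the result is the same no matter which worker runs which iteration, or in what order.

    n = 8;
    y = zeros(1,n);
    parfor i = 1:n
        % y is sliced: iteration i touches only y(i), so no iteration
        % depends on a value computed by another iteration.
        y(i) = i^2;
    end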
Nested parfor and for-Loops and Other parfor Requirements
Learn how to deal with parallel nested loops.
Troubleshoot Variables in parfor-Loops
Discover variable requirements and classification in parfor-loops.
Interactively Run a Loop in Parallel Using parfor
Convert a slow for-loop into a faster parfor-loop.
Improve parfor Performance
Create arrays inside or outside parfor-loops to speed up code.
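As a rough sketch of the idea (the sizes here are arbitrary): data that each iteration can build for itself is created inside the loop on the worker, so no large array has to be sent from the client to every worker.

    n = 50;
    r = zeros(1,n);
    parfor i = 1:n
        A = rand(1000);   % created on the worker for this iteration,
                          % not broadcast from the client
        r(i) = max(A(:)) + i;
    end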
Run Code on Parallel Pools
Learn about starting and stopping parallel pools, pool size, and cluster selection.
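For instance, a minimal sketch of explicit pool management (the pool size 4 is arbitrary; with default preferences, parfor can also start a pool automatically):

    pool = parpool(4);        % start a pool of 4 workers using the default profile

    parfor i = 1:10
        x(i) = i^2;
    end

    pool = gcp('nocreate');   % get the current pool without creating a new one
    delete(pool);             % shut the pool down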
Specify Your Parallel Preferences
Specify your preferences, and automatically create a parallel pool.
Use Objects and Handles in parfor-Loops
Discover how to use objects, handles, and sliced variables in parfor-loops.
Ensure Transparency in parfor-Loops or spmd Statements
All references to variables in parfor-loops must be visible in the body of the program.
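As a rough illustration of that rule, functions such as eval create or access variables whose names do not appear in the text of the loop body, so MATLAB rejects them inside parfor:

    % Not transparent: the variable x exists only inside the eval string,
    % so this parfor-loop throws a transparency violation error.
    parfor i = 1:4
        eval('x = i;');
    end

    % Transparent: every variable reference is visible in the code.
    parfor i = 1:4
        x(i) = i;
    end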
Scale Up parfor-Loops to Cluster and Cloud
Develop parfor-loops on your desktop, and scale up to a cluster without changing your code.
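For example, assuming a cluster profile named 'MyCluster' has already been set up (the profile name and pool size are placeholders), the loop body stays the same; only the pool you open changes:

    % Develop on the desktop with the default local pool ...
    parpool;
    parfor i = 1:100
        a(i) = max(abs(eig(rand(300))));
    end
    delete(gcp('nocreate'));

    % ... then run the identical loop on a cluster by opening the pool
    % against a cluster profile instead.
    parpool('MyCluster', 64);
    parfor i = 1:100
        a(i) = max(abs(eig(rand(300))));
    end
    delete(gcp('nocreate'));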
Use parfor-Loops for Reduction Assignments
You can use parfor-loops to calculate cumulative values that are updated by each iteration.
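A minimal sketch of a reduction assignment: s is updated in every iteration with the same associative operation, so parfor can accumulate partial results on the workers and combine them at the end.

    n = 1000;
    s = 0;
    parfor i = 1:n
        % s is a reduction variable: every iteration updates it with the
        % same operation (+), and the final value is well defined even
        % though the iteration order is not.
        s = s + 1/i^2;
    end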
Repeat Random Numbers in parfor-Loops
Control random number generation in parfor-loops by assigning a particular substream for each iteration.
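For instance, a sketch using a generator that supports substreams (here 'Threefry') held in a parallel.pool.Constant: tying the substream to the loop index makes each iteration's random numbers reproducible, regardless of which worker runs it.

    sc = parallel.pool.Constant(RandStream('Threefry'));
    r = zeros(1,10);
    parfor i = 1:10
        stream = sc.Value;       % the worker's copy of the stream
        stream.Substream = i;    % select the substream for this iteration
        r(i) = rand(stream);
    end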
Use parfor to Train Multiple Deep Learning Networks
This example shows how to use a parfor loop to perform a parameter sweep on a training option.