Incorporate MATLAB Map and Reduce Functions into a Hadoop MapReduce Job

Create a deployable archive of MATLAB® map and reduce functions, and incorporate it into a Hadoop® mapreduce job

To incorporate MATLAB map and reduce functions into a Hadoop mapreduce job, you create a deployable archive from the map and reduce functions and pass the archive as a payload argument to a job submitted to the Hadoop cluster. A deployable archive contains:

  • A mapper function written in MATLAB.

  • A reducer function written in MATLAB.

  • A MAT-file containing a datastore that describes the structure of the data and the variables to be analyzed.

  • A Hadoop settings file that identifies the map and reduce functions, the type of data being analyzed, and other configuration details.
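The mapper and reducer follow the standard MATLAB mapreduce signatures: the mapper receives a chunk of data from the datastore and adds intermediate key-value pairs to a KeyValueStore; the reducer iterates over the values collected for one intermediate key. As a minimal sketch (the function names, the airline dataset, and its ArrDelay variable are illustrative assumptions, not part of this page):

```matlab
% maxDelayMapper.m - hypothetical mapper; assumes tabular data with an ArrDelay variable
function maxDelayMapper(data, info, intermKVStore)
    % Emit the largest arrival delay seen in this chunk of the datastore
    partMax = max(data.ArrDelay);
    add(intermKVStore, 'MaxArrivalDelay', partMax);
end
```

```matlab
% maxDelayReducer.m - hypothetical reducer; folds the per-chunk maxima into one value
function maxDelayReducer(intermKey, intermValIter, outKVStore)
    overallMax = -inf;
    while hasnext(intermValIter)
        overallMax = max(overallMax, getnext(intermValIter));
    end
    add(outKVStore, intermKey, overallMax);
end
```

Functions written this way can be tested locally with mapreduce before being compiled into a deployable archive.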

For more information, see Workflow to Incorporate MATLAB Map and Reduce Functions into a Hadoop Job.

Functions

deploytool: Compile and package functions for external deployment
mcc: Compile MATLAB functions for deployment

Topics

Workflow to Incorporate MATLAB Map and Reduce Functions into a Hadoop Job

Instructions on how to create a deployable archive of MATLAB map and reduce functions and incorporate it into a Hadoop mapreduce job.

Example Using the mcc Command Workflow

Use the mcc command to create a deployable archive of MATLAB map and reduce functions. You can pass the deployable archive as a payload argument to a job submitted to a Hadoop cluster.
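An invocation of this workflow might look like the following; the archive name, configuration file, function files, and attached MAT-file are all illustrative assumptions, and the exact option syntax is covered in the topic itself:

```matlab
% Sketch of compiling MATLAB map and reduce functions into a deployable
% archive for Hadoop (file and archive names below are hypothetical):
%   -W hadoop:...  names the archive and its Hadoop configuration file
%   -a ...         attaches the MAT-file describing the datastore
mcc -H -W 'hadoop:maxArrivalDelay,CONFIG:config.txt' ...
    maxDelayMapper.m maxDelayReducer.m -a infoAboutDataset.mat
```

The resulting archive is what you pass as the payload argument when submitting the job to the Hadoop cluster.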

Configuration File for Creating Deployable Archive Using the mcc Command

Create a configuration file that describes the characteristics of the payload to the Hadoop mapreduce job. You need this file to create a deployable archive using the mcc command.
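The configuration file is a plain-text list of key-value settings naming the map and reduce functions and the input and output data types. The sketch below only illustrates the kind of entries the file carries; the key names and values shown are assumptions, and the linked topic gives the exact syntax:

```
# Hypothetical Hadoop settings file for the deployable archive
# (key names are illustrative; see the topic for the exact syntax)
mw.ds.in.type = tabulartext
mw.ds.in.format = infoAboutDataset.mat
mw.ds.out.type = keyvalue
```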

Related Information

MapReduce (MATLAB)

Datastore (MATLAB)