Incorporate MATLAB Map and Reduce Functions into a Hadoop MapReduce Job

Create a deployable archive of MATLAB® map and reduce functions, and incorporate it into a Hadoop® mapreduce job

To incorporate MATLAB map and reduce functions into a Hadoop mapreduce job, you create a deployable archive from the map and reduce functions and pass the archive as a payload argument to a job submitted to the Hadoop cluster. A deployable archive contains:

  • A mapper function written in MATLAB (a minimal sketch of the mapper and reducer follows this list).

  • A reducer function written in MATLAB.

  • A MAT-file containing a datastore that describes the structure of the data and the variables to be analyzed.

  • A Hadoop settings file that identifies the map and reduce functions, the type of data being analyzed, and other configuration details.
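
A minimal sketch of such a mapper and reducer, written against MATLAB's mapreduce function interface, is shown below. The function, key, and variable names (maxArrivalDelayMapper, ArrDelay, and so on) are illustrative only.

    function maxArrivalDelayMapper(data, info, intermKVStore)
    % Compute the maximum arrival delay in this chunk of data and add it
    % to the intermediate key-value store under a single common key.
    partMax = max(data.ArrDelay);
    add(intermKVStore, 'PartialMaxArrivalDelay', partMax);
    end

    function maxArrivalDelayReducer(intermKey, intermValIter, outKVStore)
    % Take the overall maximum across the partial maxima from all mappers.
    maxVal = -Inf;
    while hasnext(intermValIter)
        maxVal = max(maxVal, getnext(intermValIter));
    end
    add(outKVStore, 'MaxArrivalDelay', maxVal);
    end

Each function goes in its own file (for example, maxArrivalDelayMapper.m and maxArrivalDelayReducer.m) so that both files can be passed to mcc when the archive is built.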

For more information, see Workflow to Incorporate MATLAB Map and Reduce Functions into a Hadoop Job.

Functions

deploytool    Open a list of application deployment apps
mcc           Compile MATLAB functions for deployment
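
For example, assuming a configuration file named config.txt and mapper and reducer files like those sketched above, a Hadoop deployable archive might be created with a command of this general form (the -H and -W hadoop options are described on the mcc reference page; the archive and file names here are illustrative):

    mcc -H -W 'hadoop:maxArrivalDelay,CONFIG:config.txt' maxArrivalDelayMapper.m maxArrivalDelayReducer.m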

Topics

Workflow to Incorporate MATLAB Map and Reduce Functions into a Hadoop Job

Review the workflow for creating a deployable archive of MATLAB map and reduce functions and incorporating it into a Hadoop mapreduce job.

Example on Incorporating MATLAB Map and Reduce Functions into a Hadoop Job

Try an example that creates a deployable archive of MATLAB map and reduce functions and incorporates it into a Hadoop mapreduce job.
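
One early step in such an example is describing the input data with a datastore and saving it to a MAT-file, which is then packaged in the deployable archive. A minimal sketch, with illustrative file and variable names:

    % Describe the input data and the variable to analyze, then save the
    % datastore to a MAT-file for packaging in the deployable archive.
    ds = datastore('airlinesmall.csv', ...
        'TreatAsMissing', 'NA', ...
        'SelectedVariableNames', 'ArrDelay');
    save('infoAboutDataset.mat', 'ds');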

Configuration File for Creating Deployable Archive Using the mcc Command

Create a configuration file that describes the characteristics of the payload for the Hadoop mapreduce job. You need this file to create a deployable archive using the mcc command.
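
As a rough illustration only, such a file pairs the packaged functions with the data they operate on. The property names below are hypothetical placeholders rather than the documented syntax; see the topic above for the actual keys:

    # Hypothetical sketch: property names are placeholders, not documented keys
    mapper        = maxArrivalDelayMapper
    reducer       = maxArrivalDelayReducer
    datastoreFile = infoAboutDataset.mat
    inputType     = tabulartext
    outputType    = keyvalue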

Related Information