Datastore

Read large collections of data

The datastore function creates a datastore, which is a repository for collections of data that are too large to fit in memory. A datastore allows you to read and process data stored in multiple files on a disk, a remote location, or a database as a single entity. If the data is too large to fit in memory, you can manage the incremental import of data, create a tall array to work with the data, or use the datastore as an input to mapreduce for further processing. For more information, see Getting Started with Datastore.
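A minimal sketch of this workflow, assuming a set of CSV files matching the hypothetical pattern "airlines*.csv" is on the MATLAB path:

ds = tabularTextDatastore("airlines*.csv");   % datastore over all matching files
tt = tall(ds);                                % tall array backed by the datastore
% Alternatively, import the data incrementally with read, or pass ds to mapreduce.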

Functions


datastore - Create datastore for large collections of data
tabularTextDatastore - Datastore for tabular text files
spreadsheetDatastore - Datastore for spreadsheet files
imageDatastore - Datastore for image data
parquetDatastore - Datastore for collection of Parquet files
fileDatastore - Datastore with custom file reader
arrayDatastore - Datastore for in-memory data
read - Read data in datastore
readall - Read all data in datastore
preview - Preview subset of data in datastore
hasdata - Determine if data is available to read
reset - Reset datastore to initial state
writeall - Write datastore to files
shuffle - Shuffle all data in datastore
isShuffleable - Determine whether datastore is shuffleable
numpartitions - Number of datastore partitions
partition - Partition a datastore
isPartitionable - Determine whether datastore is partitionable
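
Used together, these functions support the incremental read pattern. A short sketch, assuming ds is any datastore created with one of the constructors above:

reset(ds);                    % start reading from the beginning
while hasdata(ds)
    data = read(ds);          % next block of data that fits in memory
    % ...process data...
end
p = partition(ds, numpartitions(ds), 1);   % first of N partitions, for example for parallel work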

Functions

combine - Combine data from multiple datastores
transform - Transform datastore

Objects

CombinedDatastore - Datastore to combine data read from multiple underlying datastores
TransformedDatastore - Datastore to transform underlying datastore
KeyValueDatastore - Datastore for key-value pair data for use with mapreduce
TallDatastore - Datastore for checkpointing tall arrays
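
The transform and combine functions above return TransformedDatastore and CombinedDatastore objects. A hedged sketch of chaining them, assuming two hypothetical image folders "dayImages" and "nightImages" with corresponding files:

imds1 = imageDatastore("dayImages");
imds2 = imageDatastore("nightImages");
tds = transform(imds1, @(img) im2double(img));   % returns a TransformedDatastore
cds = combine(tds, imds2);                       % returns a CombinedDatastore
pair = read(cds);   % one read returns corresponding data from both underlying datastores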

Classes


matlab.io.Datastore - Base datastore class
matlab.io.datastore.Partitionable - Add parallelization support to datastore
matlab.io.datastore.HadoopLocationBased - Add Hadoop support to datastore
matlab.io.datastore.Shuffleable - Add shuffling support to datastore
matlab.io.datastore.DsFileSet - File-set object for collection of files in datastore
matlab.io.datastore.DsFileReader - File-reader object for files in a datastore
matlab.io.datastore.FileWritable - Add file writing support to datastore
matlab.io.datastore.FoldersPropertyProvider - Add Folders property support to datastore
matlab.io.datastore.FileSet - File-set for collection of files in datastore
matlab.io.datastore.BlockedFileSet - Blocked file-set for collection of blocks within file
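
As a hedged illustration of the base class and its core methods, here is a minimal custom datastore over an in-memory numeric vector. The class name MyDatastore is hypothetical, and real implementations typically read from files instead:

classdef MyDatastore < matlab.io.Datastore
    properties (Access = private)
        Data     % numeric vector held in memory
        Index    % position of the next element to read
    end
    methods
        function ds = MyDatastore(data)
            ds.Data = data;
            ds.Index = 1;
        end
        function tf = hasdata(ds)
            tf = ds.Index <= numel(ds.Data);
        end
        function [data, info] = read(ds)
            data = ds.Data(ds.Index);          % one element per read call
            info = struct('Index', ds.Index);  % bookkeeping returned alongside the data
            ds.Index = ds.Index + 1;
        end
        function reset(ds)
            ds.Index = 1;
        end
        function frac = progress(ds)
            frac = (ds.Index - 1) / numel(ds.Data);
        end
    end
end

With ds = MyDatastore(1:5), the same hasdata/read loop shown earlier works unchanged.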

Topics

Getting Started with Datastore

A datastore is an object for reading a single file or a collection of files or data.

Select Datastore for File Format or Application

Choose the right datastore based on the file format of your data or application.

Read and Analyze Large Tabular Text File

This example shows how to create a datastore for a large text file containing tabular data, and then read and process the data one block at a time or one file at a time.
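
A sketch of the block-at-a-time pattern that example uses, assuming a hypothetical large file bigfile.csv:

ds = tabularTextDatastore("bigfile.csv");
ds.ReadSize = 20000;        % rows per call to read; set to 'file' to read a whole file at a time
while hasdata(ds)
    T = read(ds);           % a table with up to ReadSize rows
    % ...update running statistics with T...
end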

Read and Analyze Image Files

This example shows how to create a datastore for a collection of images, read the image files, and find the images with the maximum average hue, saturation, and brightness (HSV).
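
A hedged sketch of that workflow, assuming a hypothetical folder myImages of RGB image files:

imds = imageDatastore("myImages");
meanHSV = zeros(numel(imds.Files), 3);
for k = 1:numel(imds.Files)
    img = readimage(imds, k);                       % read the k-th image
    hsv = rgb2hsv(img);                             % convert RGB to hue, saturation, value
    meanHSV(k, :) = squeeze(mean(mean(hsv, 1), 2))';
end
[~, idx] = max(meanHSV);   % indices of the images with maximum mean H, S, and V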

Read and Analyze MAT-File with Key-Value Data

This example shows how to create a datastore for key-value pair data in a MAT-file that is the output of mapreduce.

Read and Analyze Hadoop Sequence File

This example shows how to create a datastore for a Sequence file containing key-value data.

Work with Remote Data

Work with remote data in Amazon S3™, Microsoft® Azure® Storage Blob, or HDFS™.
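
A datastore can point directly at cloud storage. A sketch assuming a hypothetical S3 bucket and AWS credentials already set in the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY:

ds = tabularTextDatastore("s3://mybucket/datasets/*.csv");   % hypothetical bucket and path
t = preview(ds);                                             % read a small sample over the network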

Set Up Datastore for Processing on Different Machines or Clusters

Set up a datastore on your machine that can be loaded and processed on another machine or cluster.
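
One hedged sketch of this, assuming the same files are visible at C:\data on the local machine and /mnt/data on the cluster (both paths hypothetical), uses the AlternateFileSystemRoots property of the built-in datastores:

ds = imageDatastore("C:\data\images");
ds.AlternateFileSystemRoots = ["C:\data", "/mnt/data"];   % equivalent root paths on the two machines
save mydatastore.mat ds        % on the other machine: load mydatastore.mat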

Develop Custom Datastore

Create a fully customized datastore for your custom or proprietary data.

Develop Custom Datastore for DICOM Data

This example shows how to develop a custom datastore that supports writing operations.

Testing Guidelines for Custom Datastores

After implementing your custom datastore, follow this test procedure to qualify your custom datastore.