Deep storage

Deep storage is where segments are stored. Apache Druid does not provide this storage mechanism itself; you supply it from outside Druid. The deep storage infrastructure defines the level of durability of your data: as long as Druid processes can see this storage infrastructure and access the segments stored on it, you will not lose data no matter how many Druid nodes you lose. If segments disappear from this storage layer, you lose whatever data those segments represented.

In addition to being the backing store for segments, deep storage can serve queries directly: the query from deep storage feature lets you run queries against segments stored primarily in deep storage. The load rules you configure determine whether segments exist primarily in deep storage or in a combination of deep storage and Historical processes.
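
For example, here is a hedged sketch of retention rules that keep only recent data loaded on Historical processes. It assumes a recent Druid release in which an empty tieredReplicants map means zero Historical replicas; the one-month period, replica count, and tier name are placeholders.

  [
    {
      "type": "loadByPeriod",
      "period": "P1M",
      "tieredReplicants": { "_default_tier": 2 }
    },
    {
      "type": "loadForever",
      "tieredReplicants": {}
    }
  ]

With rules evaluated top to bottom, segments newer than one month are loaded with two replicas on the default tier, while everything older stays in deep storage only and remains reachable through query from deep storage.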

Deep storage options

Druid supports multiple options for deep storage, including blob storage from major cloud providers. Select the one that fits your environment.

Local

Local storage is intended for use in the following situations:

  • You have just one server.
  • You have multiple servers that all have access to a shared filesystem (for example, NFS).

In multi-server production clusters, use cloud-based deep storage (Amazon S3, Google Cloud Storage, or Azure Blob Storage), S3-compatible storage (such as MinIO), or HDFS instead of local storage with a shared filesystem. These options are generally more convenient, more scalable, and more robust than setting up a shared filesystem.

The following configurations in common.runtime.properties apply to local storage:

| Property | Possible values | Description | Default |
|---|---|---|---|
| druid.storage.type | local | Must be set. | |
| druid.storage.storageDirectory | any local directory | Directory for storing segments. Must be different from druid.segmentCache.locations and druid.segmentCache.infoDir. | /tmp/druid/localStorage |
| druid.storage.zip | true, false | Whether segments in druid.storage.storageDirectory are written as directories (false) or zip files (true). | false |

For example:

  druid.storage.type=local
  druid.storage.storageDirectory=/tmp/druid/localStorage

The druid.storage.storageDirectory must be set to a different path than druid.segmentCache.locations or druid.segmentCache.infoDir.

Amazon S3 or S3-compatible

See druid-s3-extensions.
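
As a rough sketch, S3 deep storage is configured in common.runtime.properties along these lines; the bucket and base key below are placeholders, and credentials are typically supplied through instance roles, environment variables, or the extension's druid.s3.* properties as described in its documentation.

  # Load the S3 extension and point segment storage at a bucket (placeholder names)
  druid.extensions.loadList=["druid-s3-extensions"]
  druid.storage.type=s3
  druid.storage.bucket=your-druid-bucket
  druid.storage.baseKey=druid/segments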

Google Cloud Storage

See druid-google-extensions.
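
A minimal sketch for Google Cloud Storage, assuming the property names from the druid-google-extensions documentation; the bucket and prefix are placeholders, and authentication usually comes from the environment's Google credentials.

  # Load the GCS extension and point segment storage at a bucket (placeholder names)
  druid.extensions.loadList=["druid-google-extensions"]
  druid.storage.type=google
  druid.google.bucket=your-druid-bucket
  druid.google.prefix=druid/segments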

Azure Blob Storage

See druid-azure-extensions.
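
A minimal sketch for Azure Blob Storage, assuming the property names from the druid-azure-extensions documentation; the account and container are placeholders, and the storage credential (account key, SAS token, or managed identity) is configured separately per the extension docs.

  # Load the Azure extension and point segment storage at a container (placeholder names)
  druid.extensions.loadList=["druid-azure-extensions"]
  druid.storage.type=azure
  druid.azure.account=yourstorageaccount
  druid.azure.container=your-druid-container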

HDFS

See the druid-hdfs-storage extension documentation.
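
A minimal sketch for HDFS; the NameNode host, port, and path are placeholders.

  # Load the HDFS extension and store segments under an HDFS path (placeholder URI)
  druid.extensions.loadList=["druid-hdfs-storage"]
  druid.storage.type=hdfs
  druid.storage.storageDirectory=hdfs://namenode:8020/druid/segments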

Additional options

For additional deep storage options, please see our extensions list.

Querying from deep storage

Querying from deep storage is not as performant as querying segments loaded onto Historical processes, but it lets you access segments that you query infrequently or that do not need the very low latency Druid queries traditionally provide. You trade some query performance for a lower total storage cost, because you can keep more of your data accessible without increasing the number or capacity of your Historical processes.
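
As a sketch, such queries go through Druid's asynchronous SQL statements API rather than the interactive SQL endpoint; the datasource name and time filter below are placeholders.

  POST /druid/v2/sql/statements
  {
    "query": "SELECT COUNT(*) FROM your_datasource WHERE __time < TIMESTAMP '2023-01-01'",
    "context": { "executionMode": "ASYNC" }
  }

The response includes a query ID that you poll for status and then use to fetch results once the query completes.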

For information about how to run queries, see Query from deep storage.