Migrate from firehose to input source ingestion (legacy)
Apache Druid deprecated firehose ingestion in version 0.17 and removed support for it in version 26.0.
Firehose ingestion doesn't work with newer Druid versions, so you must migrate to an ingestion spec with a defined input source before you upgrade.
Migrate from firehose ingestion to an input source
To migrate from firehose ingestion, you can use the Druid console to update your ingestion spec, or you can update it manually.
Use the Druid console
To update your ingestion spec using the Druid console, open the console and copy your spec into the Edit spec stage of the data loader.
Druid converts the spec into one with a defined input source. For example, it converts the example firehose ingestion spec below into the migrated example spec that follows it.
If you can't use the console, or the console method doesn't work for your spec, update your ingestion spec manually instead.
Update your ingestion spec manually
To update your ingestion spec manually, copy your existing spec into a new file. Refer to Native batch ingestion with firehose (Deprecated) for a description of firehose properties.
Edit the new file as follows:
- In the `ioConfig` component, replace the `firehose` definition with an `inputSource` definition for your chosen input source. See Native batch input sources for details.
- Move the `timestampSpec` definition from `parser.parseSpec` to the `dataSchema` component.
- Move the `dimensionsSpec` definition from `parser.parseSpec` to the `dataSchema` component.
- Move the `format` definition from `parser.parseSpec` to an `inputFormat` definition in `ioConfig`.
- Delete the `parser` definition.
- Save the file. You can check the format of your new ingestion file against the migrated example below.
- Test the new ingestion spec with a temporary data source.
- Once you've successfully ingested sample data with the new spec, stop firehose ingestion and switch to the new spec.
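The edits above can be sketched as a small script. This is a minimal sketch, not part of Druid: it assumes a firehose type (such as `local`) whose properties map directly onto the corresponding input source, and a `string` parser whose `parseSpec` carries `format`, `timestampSpec`, and `dimensionsSpec`. Specs that use other firehose or parser types may need additional handling.

```python
import json

def migrate_firehose_spec(spec):
    """Transform a legacy firehose ingestion spec into an input-source spec.

    Follows the manual steps above; returns a new dict and leaves the
    original spec untouched.
    """
    new_spec = json.loads(json.dumps(spec))  # deep copy via round-trip
    io_config = new_spec["spec"]["ioConfig"]
    data_schema = new_spec["spec"]["dataSchema"]

    # Delete the parser definition, keeping its parseSpec for the moves below.
    parse_spec = data_schema.pop("parser")["parseSpec"]

    # Replace the firehose definition with an inputSource definition.
    # Assumes the firehose properties (e.g. local's baseDir and filter)
    # carry over to the input source of the same type.
    io_config["inputSource"] = io_config.pop("firehose")

    # Move timestampSpec and dimensionsSpec from parser.parseSpec to dataSchema.
    data_schema["timestampSpec"] = parse_spec["timestampSpec"]
    data_schema["dimensionsSpec"] = parse_spec["dimensionsSpec"]

    # Move the format definition to an inputFormat definition in ioConfig.
    io_config["inputFormat"] = {"type": parse_spec["format"]}

    return new_spec
```

Running the sketch on the example firehose spec below produces a spec shaped like the migrated example; always review the output and test it against a temporary data source before switching over.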
When the transition is complete, you can upgrade Druid to the latest version. See the Druid release notes for upgrade instructions.
Example firehose ingestion spec
An example firehose ingestion spec is as follows:
```json
{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "wikipedia",
      "metricsSpec" : [
        {
          "type" : "count",
          "name" : "count"
        },
        {
          "type" : "doubleSum",
          "name" : "added",
          "fieldName" : "added"
        },
        {
          "type" : "doubleSum",
          "name" : "deleted",
          "fieldName" : "deleted"
        },
        {
          "type" : "doubleSum",
          "name" : "delta",
          "fieldName" : "delta"
        }
      ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : "NONE",
        "intervals" : [ "2013-08-31/2013-09-01" ]
      },
      "parser": {
        "type": "string",
        "parseSpec": {
          "format": "json",
          "timestampSpec" : {
            "column" : "timestamp",
            "format" : "auto"
          },
          "dimensionsSpec" : {
            "dimensions": ["country", "page", "language", "user", "unpatrolled", "newPage", "robot", "anonymous", "namespace", "continent", "region", "city"],
            "dimensionExclusions" : []
          }
        }
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "local",
        "baseDir" : "examples/indexing/",
        "filter" : "wikipedia_data.json"
      }
    },
    "tuningConfig" : {
      "type" : "index",
      "partitionsSpec": {
        "type": "single_dim",
        "partitionDimension": "country",
        "targetRowsPerSegment": 5000000
      }
    }
  }
}
```
Example ingestion spec after migration
The following example illustrates the result of migrating the example firehose ingestion spec to a spec with an input source:
```json
{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "wikipedia",
      "timestampSpec" : {
        "column" : "timestamp",
        "format" : "auto"
      },
      "dimensionsSpec" : {
        "dimensions": ["country", "page", "language", "user", "unpatrolled", "newPage", "robot", "anonymous", "namespace", "continent", "region", "city"],
        "dimensionExclusions" : []
      },
      "metricsSpec" : [
        {
          "type" : "count",
          "name" : "count"
        },
        {
          "type" : "doubleSum",
          "name" : "added",
          "fieldName" : "added"
        },
        {
          "type" : "doubleSum",
          "name" : "deleted",
          "fieldName" : "deleted"
        },
        {
          "type" : "doubleSum",
          "name" : "delta",
          "fieldName" : "delta"
        }
      ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : "NONE",
        "intervals" : [ "2013-08-31/2013-09-01" ]
      }
    },
    "ioConfig" : {
      "type" : "index",
      "inputSource" : {
        "type" : "local",
        "baseDir" : "examples/indexing/",
        "filter" : "wikipedia_data.json"
      },
      "inputFormat": {
        "type": "json"
      }
    },
    "tuningConfig" : {
      "type" : "index",
      "partitionsSpec": {
        "type": "single_dim",
        "partitionDimension": "country",
        "targetRowsPerSegment": 5000000
      }
    }
  }
}
```
Learn more
For more information, see the following pages:
- Ingestion: Overview of the Druid ingestion process.
- Native batch ingestion: Description of the supported native batch indexing tasks.
- Ingestion spec reference: Description of the components and properties in the ingestion spec.