Migrate from firehose to input source ingestion (legacy)

Apache Druid deprecated firehose ingestion in version 0.17. Support for firehose ingestion was removed in version 26.0.

Firehose ingestion doesn’t work with newer Druid versions, so you must migrate to an ingestion spec with a defined input source before you upgrade.

Migrate from firehose ingestion to an input source

To migrate from firehose ingestion, you can use the Druid console to update your ingestion spec, or you can update it manually.

Use the Druid console

To update your ingestion spec using the Druid console, open the console and copy your spec into the Edit spec stage of the data loader.

Druid converts the spec into one with a defined input source. For example, it converts the example firehose ingestion spec below into the example ingestion spec after migration.

If you’re unable to use the console, or the console method doesn’t work for your spec, update your ingestion spec manually instead.

Update your ingestion spec manually

To update your ingestion spec manually, copy your existing spec into a new file. Refer to Native batch ingestion with firehose (Deprecated) for a description of firehose properties.

Edit the new file as follows:

  1. In the ioConfig component, replace the firehose definition with an inputSource definition for your chosen input source. See Native batch input sources for details.
  2. Move the timestampSpec definition from parser.parseSpec to the dataSchema component.
  3. Move the dimensionsSpec definition from parser.parseSpec to the dataSchema component.
  4. Move the format definition from parser.parseSpec to an inputFormat definition in ioConfig.
  5. Delete the parser definition.
  6. Save the file. You can check the format of your new ingestion file against the migrated example below.
  7. Test the new ingestion spec with a temporary data source.
  8. Once you’ve successfully ingested sample data with the new spec, stop firehose ingestion and switch to the new spec.

When the transition is complete, you can upgrade Druid to the latest version. See the Druid release notes for upgrade instructions.
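Steps 1 through 5 above are a mechanical transformation of the spec's JSON, so they can be sketched as a small script. This is an illustrative sketch, not an official Druid tool: the function name is hypothetical, and the direct copy of firehose properties into inputSource only holds for simple cases such as the local firehose shown in the examples below; other firehose types need type-specific handling.

```python
import json

def migrate_firehose_spec(spec):
    """Apply the manual migration steps to a native batch ingestion spec.
    Illustrative sketch only; assumes a firehose whose properties map
    directly onto the equivalent inputSource (e.g. the local firehose)."""
    spec = json.loads(json.dumps(spec))  # deep copy so the input is untouched
    inner = spec["spec"]

    # Step 5 (and the source for steps 2-4): remove the parser definition.
    parse_spec = inner["dataSchema"].pop("parser")["parseSpec"]

    # Steps 2-3: move timestampSpec and dimensionsSpec to dataSchema.
    inner["dataSchema"]["timestampSpec"] = parse_spec["timestampSpec"]
    inner["dataSchema"]["dimensionsSpec"] = parse_spec["dimensionsSpec"]

    # Step 1: replace the firehose definition with an inputSource definition.
    inner["ioConfig"]["inputSource"] = inner["ioConfig"].pop("firehose")

    # Step 4: move the format to an inputFormat definition in ioConfig.
    inner["ioConfig"]["inputFormat"] = {"type": parse_spec["format"]}
    return spec
```

After running a script like this, you would still validate the result against the input source and input format reference documentation, then follow steps 6 through 8 to test before switching over.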

Example firehose ingestion spec

An example firehose ingestion spec is as follows:

```json
{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "wikipedia",
      "metricsSpec" : [
        {
          "type" : "count",
          "name" : "count"
        },
        {
          "type" : "doubleSum",
          "name" : "added",
          "fieldName" : "added"
        },
        {
          "type" : "doubleSum",
          "name" : "deleted",
          "fieldName" : "deleted"
        },
        {
          "type" : "doubleSum",
          "name" : "delta",
          "fieldName" : "delta"
        }
      ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : "NONE",
        "intervals" : [ "2013-08-31/2013-09-01" ]
      },
      "parser": {
        "type": "string",
        "parseSpec": {
          "format": "json",
          "timestampSpec" : {
            "column" : "timestamp",
            "format" : "auto"
          },
          "dimensionsSpec" : {
            "dimensions": ["country", "page", "language", "user", "unpatrolled", "newPage", "robot", "anonymous", "namespace", "continent", "region", "city"],
            "dimensionExclusions" : []
          }
        }
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "local",
        "baseDir" : "examples/indexing/",
        "filter" : "wikipedia_data.json"
      }
    },
    "tuningConfig" : {
      "type" : "index",
      "partitionsSpec": {
        "type": "single_dim",
        "partitionDimension": "country",
        "targetRowsPerSegment": 5000000
      }
    }
  }
}
```

Example ingestion spec after migration

The following example illustrates the result of migrating the example firehose ingestion spec to a spec with an input source:

```json
{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "wikipedia",
      "timestampSpec" : {
        "column" : "timestamp",
        "format" : "auto"
      },
      "dimensionsSpec" : {
        "dimensions": ["country", "page", "language", "user", "unpatrolled", "newPage", "robot", "anonymous", "namespace", "continent", "region", "city"],
        "dimensionExclusions" : []
      },
      "metricsSpec" : [
        {
          "type" : "count",
          "name" : "count"
        },
        {
          "type" : "doubleSum",
          "name" : "added",
          "fieldName" : "added"
        },
        {
          "type" : "doubleSum",
          "name" : "deleted",
          "fieldName" : "deleted"
        },
        {
          "type" : "doubleSum",
          "name" : "delta",
          "fieldName" : "delta"
        }
      ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : "NONE",
        "intervals" : [ "2013-08-31/2013-09-01" ]
      }
    },
    "ioConfig" : {
      "type" : "index",
      "inputSource" : {
        "type" : "local",
        "baseDir" : "examples/indexing/",
        "filter" : "wikipedia_data.json"
      },
      "inputFormat": {
        "type": "json"
      }
    },
    "tuningConfig" : {
      "type" : "index",
      "partitionsSpec": {
        "type": "single_dim",
        "partitionDimension": "country",
        "targetRowsPerSegment": 5000000
      }
    }
  }
}
```

Learn more

For more information, see the following pages: