
Add Plugin Testing

The following is a guide for setting up a testing environment for Kong Gateway custom plugins.

Prerequisites

This page is the second chapter in the Getting Started guide for developing custom plugins. These instructions refer to the previous chapter in the guide and require the same developer tool prerequisites.

Step by step

Now that you have a basic plugin project, you can build testing automations for it.

Install Pongo

Pongo is a tool that helps you validate and distribute custom plugins for Kong Gateway. Pongo uses Docker to bootstrap a Kong Gateway environment, letting you quickly load your plugin, run automated tests, and manually validate the plugin’s behavior against various Kong Gateway versions.

The following script can automate the installation of Pongo for you. If you prefer, you can follow the manual installation instructions instead.

If you already have Pongo installed, you can skip to the next step or run the install script to update Pongo to the latest version.

Run the following to install or update Pongo:

  curl -Ls https://get.konghq.com/pongo | bash

For the remainder of this guide to work properly, the pongo command must be present in your system path. The script and manual installation instructions above both include hints for putting pongo on your path.
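If your shell cannot find pongo, the install directory is likely missing from your PATH. As a hypothetical example (the exact directory depends on what the install script reported; ~/.local/bin is a common default), you can prepend it for the current session:

```shell
# Prepend the Pongo install directory to PATH for this session.
# ~/.local/bin is an assumption; use the directory the install script reported.
export PATH="$HOME/.local/bin:$PATH"

# Confirm the shell can now resolve the command
command -v pongo || echo "pongo still not found"
```

To make this persistent, add the export line to your shell profile (for example, ~/.bashrc or ~/.zshrc).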

Ensure that the pongo command is available in your PATH by running the command within your project directory:

  pongo help

With Pongo installed, you can now set up a test environment for your new plugin.

Initialize the test environment

Pongo lets you validate a plugin’s behavior by giving you tools to quickly run a Kong Gateway with the plugin installed and available.

Let’s validate the plugin manually first, and then you will add automated tests in subsequent steps of this guide.

Note: Kong Gateway runs in a variety of deployment topologies. By default, Pongo runs Kong Gateway in traditional mode, which uses a database to store configured entities such as routes, services, and plugins. Kong Gateway and the database are run in separate containers, letting you cycle the gateway independently of the database. This enables a quick and iterative approach to validating the plugin’s logical behavior while keeping the gateway state independent in the database.

Pongo provides an optional command that initializes the project directory with some default configuration files. You can run it to start a new project.

Important: These commands must be run inside the my-plugin project root directory so that Pongo properly packages and includes the plugin code in the running Kong Gateway.

Initialize the project folder:

  pongo init

Now you can start dependency containers for Kong Gateway. By default, this only includes the Postgres database used in traditional mode.

Start the dependencies:

  pongo up

Once the dependencies are running successfully, you can run a Kong Gateway container and open a shell within it to interact with the gateway. Pongo runs a Kong Gateway container with various CLI tools pre-installed to help with testing.

Launch the gateway and open a shell with:

  pongo shell

Your terminal is now running a shell inside the Kong Gateway container. Your shell prompt should change, showing the gateway version, the host plugin directory, and the current path inside the container. For example, your prompt may look like the following:

  [Kong-3.6.1:my-plugin:/kong]$

Pongo provides some aliases to assist with the lifecycle of Kong Gateway and the database. On its first run, you need to initialize the database and start Kong Gateway. Pongo provides the kms alias to perform this common task.

Run the database migrations and start Kong Gateway:

  kms

You should see a success message that Kong Gateway has started:

  ...
  64 migrations processed
  64 executed
  Database is up-to-date
  Kong started

As mentioned previously, Pongo installs some development tools to help you test your plugin. You can now validate that the plugin is installed by querying the Admin API using curl and filtering the response with jq:

  curl -s localhost:8001 | \
    jq '.plugins.available_on_server."my-plugin"'

You should see a response that matches the information in the plugin’s table:

  {
    "priority": 1000,
    "version": "0.0.1"
  }
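If you want to script this check, for example to fail a setup script when the plugin is missing, the same jq filter can be applied to any copy of the response. A minimal sketch, using the JSON fragment above as inline sample data in place of a live Admin API call:

```shell
# Sample of the relevant Admin API fragment; with a running gateway you would
# use `curl -s localhost:8001` in place of the echo.
echo '{"plugins":{"available_on_server":{"my-plugin":{"priority":1000,"version":"0.0.1"}}}}' \
  | jq -e '.plugins.available_on_server."my-plugin"' > /dev/null \
  && echo "my-plugin is loaded"
```

The `-e` flag makes jq exit non-zero when the filter produces null, so the check composes cleanly with `&&` in scripts.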

With the test environment initialized, you can now manually run the plugin code.

Manually test plugin

With the plugin installed, you can now configure Kong Gateway entities to invoke and validate the plugin’s behavior.

Note: For each of the following POST requests to the Admin API, you should receive an HTTP/1.1 201 Created response from Kong Gateway, indicating the successful creation of the entity.

Still within the Kong Gateway container’s shell, add a new service with the following:

  curl -i -s -X POST http://localhost:8001/services \
    --data name=example_service \
    --data url='http://httpbin.org'

Associate the custom plugin with the example_service service:

  curl -is -X POST http://localhost:8001/services/example_service/plugins \
    --data 'name=my-plugin'

Add a new route for sending requests through the example_service:

  curl -i -X POST http://localhost:8001/services/example_service/routes \
    --data 'paths[]=/mock' \
    --data name=example_route

The plugin is now configured and will be invoked when Kong Gateway proxies requests via the example_service. Prior to forwarding the response from the upstream, the plugin should append the X-MyPlugin header to the list of response headers.

Test the behavior by proxying a request and asking curl to show the response headers with the -i flag:

  curl -i http://localhost:8000/mock/anything

curl should report HTTP/1.1 200 OK and show the response headers from the gateway. You should see X-MyPlugin: response in the set of headers, indicating that the plugin’s logic has been invoked.

For example:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Connection: keep-alive
  Content-Length: 529
  Access-Control-Allow-Credentials: true
  Date: Tue, 12 Mar 2024 14:44:22 GMT
  Access-Control-Allow-Origin: *
  Server: gunicorn/19.9.0
  X-MyPlugin: response
  X-Kong-Upstream-Latency: 97
  X-Kong-Proxy-Latency: 1
  Via: kong/3.6.1
  X-Kong-Request-Id: 8ab8c32c4782536592994514b6dadf55
Exit the Kong Gateway shell before proceeding:

  exit

Manually validating a plugin using the Pongo shell works nicely for getting started quickly. For production scenarios, you will likely want automated testing, and perhaps a test-driven development (TDD) workflow. Let’s see how Pongo can help with this as well.

Write a test

Pongo supports running automated tests using the Busted Lua test framework. In plugin projects, the test files reside under the spec/<plugin-name> directory. For this project, this is the spec/my-plugin folder you created earlier.
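Assuming the layout from the previous chapter, the relevant parts of the project tree look roughly like this (the kong/plugins paths come from the earlier setup; only the spec path is new here):

```
my-plugin/
├── kong/
│   └── plugins/
│       └── my-plugin/
│           ├── handler.lua
│           └── schema.lua
└── spec/
    └── my-plugin/
        └── 01-integration_spec.lua
```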

The following is a code listing for a test that validates the plugin’s current behavior. Copy this code and place it into a new file located at spec/my-plugin/01-integration_spec.lua. See the code comments for details on the design of the test and the test helpers provided by Kong Gateway.

  -- Helper functions provided by Kong Gateway, see https://github.com/Kong/kong/blob/master/spec/helpers.lua
  local helpers = require "spec.helpers"

  -- matches the plugin name defined in the plugin's schema.lua
  local PLUGIN_NAME = "my-plugin"

  -- Run the tests for each strategy. Strategies include "postgres" and "off",
  -- which represent the deployment topologies for Kong Gateway
  for _, strategy in helpers.all_strategies() do
    describe(PLUGIN_NAME .. ": [#" .. strategy .. "]", function()
      -- Will be initialized before each nested test
      local client

      setup(function()
        -- A BluePrint gives us a helpful database wrapper to
        -- manage Kong Gateway entities directly.
        -- This function also truncates any existing data in an existing db.
        -- The custom plugin name is provided to this function so it is marked as loaded
        local blue_print = helpers.get_db_utils(strategy, nil, { PLUGIN_NAME })

        -- Using the BluePrint to create a test route automatically attaches it
        -- to the default "echo" service that will be created by the test framework
        local test_route = blue_print.routes:insert({
          paths = { "/mock" },
        })

        -- Add the custom plugin to the test route
        blue_print.plugins:insert {
          name = PLUGIN_NAME,
          route = { id = test_route.id },
        }

        -- start kong
        assert(helpers.start_kong({
          -- use the custom test template to create a local mock server
          nginx_conf = "spec/fixtures/custom_nginx.template",
          -- make sure our plugin gets loaded
          plugins = "bundled," .. PLUGIN_NAME,
        }))
      end)

      -- teardown runs after its parent describe block
      teardown(function()
        helpers.stop_kong(nil, true)
      end)

      -- before_each runs before each child describe
      before_each(function()
        client = helpers.proxy_client()
      end)

      -- after_each runs after each child describe
      after_each(function()
        if client then client:close() end
      end)

      -- a nested describe defines an actual test on the plugin behavior
      describe("The response", function()
        it("gets the expected header", function()
          -- invoke a test request
          local r = client:get("/mock/anything", {})

          -- validate that the request succeeded, response status 200
          assert.response(r).has.status(200)

          -- now validate and retrieve the expected response header
          local header_value = assert.response(r).has.header("X-MyPlugin")

          -- validate the value of that header
          assert.equal("response", header_value)
        end)
      end)
    end)
  end

With this test code, Pongo can help automate testing.

Run the test

Pongo can run automated tests with the pongo run command. When this is executed, Pongo determines whether dependency containers are already running and reuses them if they are. The test library handles truncating existing data between test runs for you.

Execute a test run:

  pongo run

You should see a successful report that looks similar to the following:

  [pongo-INFO] auto-starting the test environment, use the 'pongo down' action to stop it
  Kong version: 3.6.1
  [==========] Running tests from scanned files.
  [----------] Global test environment setup.
  [----------] Running tests from /kong-plugin/spec/my-plugin/01-integration_spec.lua
  [----------] Running tests from /kong-plugin/spec/my-plugin/01-integration_spec.lua
  [ RUN ] /kong-plugin/spec/my-plugin/01-integration_spec.lua:63: my-plugin: [#postgres] The response gets a 'X-MyPlugin' header
  [ OK ] /kong-plugin/spec/my-plugin/01-integration_spec.lua:63: my-plugin: [#postgres] The response gets a 'X-MyPlugin' header (6.59 ms)
  [ RUN ] /kong-plugin/spec/my-plugin/01-integration_spec.lua:63: my-plugin: [#off] The response gets a 'X-MyPlugin' header
  [ OK ] /kong-plugin/spec/my-plugin/01-integration_spec.lua:63: my-plugin: [#off] The response gets a 'X-MyPlugin' header (4.76 ms)
  [----------] 2 tests from /kong-plugin/spec/my-plugin/01-integration_spec.lua (23022.12 ms total)
  [----------] Global test environment teardown.
  [==========] 2 tests from 1 test file ran. (23022.80 ms total)
  [ PASSED ] 2 tests.

Pongo can also run as part of a Continuous Integration (CI) system. See the repository documentation for more details.
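As a rough sketch of what a CI setup might look like (the action versions, file name, and KONG_VERSION value below are illustrative assumptions; consult the Pongo repository for the officially supported configuration), a GitHub Actions job could clone Pongo and invoke its script directly:

```yaml
# Hypothetical GitHub Actions workflow; versions and values are assumptions.
name: Test plugin
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Clone Pongo
        run: git clone --single-branch https://github.com/Kong/kong-pongo ../kong-pongo
      - name: Run plugin tests
        run: KONG_VERSION=3.6.x ../kong-pongo/pongo.sh run
```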

What’s next?

With the project setup and automated testing in place, the next chapter will walk you through adding configurable values to the plugin.

