celery.contrib.batches
Experimental task class that buffers messages and processes them as a list.
Warning
For this to work you have to set CELERYD_PREFETCH_MULTIPLIER to zero, or to some value where the resulting prefetch count (the multiplier times the worker concurrency) is higher than flush_every; see the configuration sketch below.
In the future we hope to add the ability to direct batching tasks to a channel with different QoS requirements than the task channel.
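A minimal sketch of applying that setting directly to the app configuration (the app name and broker URL here are placeholders, not part of the original example):

from celery import Celery

app = Celery('tasks', broker='amqp://')

# 0 disables the prefetch limit entirely, so the worker can keep
# consuming messages until flush_every is reached.
app.conf.CELERYD_PREFETCH_MULTIPLIER = 0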
Simple Example
A click counter that flushes the buffer every 100 messages, and every 10 seconds. Does not do anything with the data, but can easily be modified to store it in a database.
from collections import Counter

from celery.contrib.batches import Batches


# Flush after 100 messages, or 10 seconds.
@app.task(base=Batches, flush_every=100, flush_interval=10)
def count_click(requests):
    count = Counter(request.kwargs['url'] for request in requests)
    for url, clicks in count.items():
        print('>>> Clicks: {0} -> {1}'.format(url, clicks))
Then you can ask for a click to be counted by doing:
>>> count_click.delay(url='http://example.com')
Example returning results
An interface to the Web of Trust API that flushes the buffer every 100 messages, and every 10 seconds.
import requests
from urllib.parse import urlparse

from celery.contrib.batches import Batches

wot_api_target = "https://api.mywot.com/0.4/public_link_json"


@app.task(base=Batches, flush_every=100, flush_interval=10)
def wot_api(requests):
    sig = lambda url: url
    responses = wot_api_real(
        sig(*request.args, **request.kwargs) for request in requests
    )
    # use mark_as_done to manually return response data
    for response, request in zip(responses, requests):
        app.backend.mark_as_done(request.id, response)


def wot_api_real(urls):
    domains = [urlparse(url).netloc for url in urls]
    response = requests.get(
        wot_api_target,
        params={"hosts": '/'.join(set(domains)) + '/'},
    )
    return [response.json()[domain] for domain in domains]
Using the API is done as follows:
>>> wot_api.delay('http://example.com')
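Because each response is stored with mark_as_done, the AsyncResult returned by delay() can then be used to fetch it in the usual way. A sketch, assuming a result backend is configured (the timeout value is arbitrary):

>>> result = wot_api.delay('http://example.com')
>>> result.get(timeout=30)  # blocks until the batch containing this request has been flushed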
Note
If you don’t have an app instance then use the current app proxy instead:
from celery import current_app

current_app.backend.mark_as_done(request.id, response)
API
class celery.contrib.batches.Batches[source]
Strategy(task, app, consumer)[source]
apply_buffer(requests, args=(), kwargs={})[source]
flush(requests)[source]
flush_every = 10
Maximum number of messages in the buffer.
flush_interval = 30
Timeout in seconds before buffer is flushed anyway.
run(requests)[source]
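The flush_every and flush_interval defaults can also be overridden on a subclass instead of being passed to the task decorator. A hedged sketch (the subclass and task names are hypothetical, and app is assumed to be the Celery application from the earlier examples):

from celery.contrib.batches import Batches

class SlowBatches(Batches):
    # Buffer up to 500 messages, or flush after 60 seconds,
    # whichever happens first.
    flush_every = 500
    flush_interval = 60

@app.task(base=SlowBatches)
def aggregate(requests):
    ...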
class celery.contrib.batches.SimpleRequest(id, name, args, kwargs, delivery_info, hostname)[source]
Pickleable request.
args = ()
positional arguments
delivery_info = None
message delivery information.
classmethod from_request(request)[source]
hostname = None
worker node name
id = None
task id
kwargs = {}
keyword arguments
name = None
task name
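Inside a batch task, each element of requests is one of these SimpleRequest objects rather than a full worker request. A hedged sketch of accessing its attributes (the task name is hypothetical; Batches and app are assumed to be imported and defined as above):

@app.task(base=Batches, flush_every=100, flush_interval=10)
def log_batch(requests):
    for request in requests:
        # Each field mirrors the original task message.
        print('task {0} (id {1}) from {2}: args={3!r} kwargs={4!r}'.format(
            request.name, request.id, request.hostname,
            request.args, request.kwargs))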