Logging

Note

scrapy.log has been deprecated alongside its functions in favor of explicit calls to the Python standard logging. Keep reading to learn more about the new logging system.

Scrapy uses Python’s builtin logging system for event logging. We’ll provide some simple examples to get you started, but for more advanced use cases it’s strongly suggested to read its documentation thoroughly.

Logging works out of the box, and can be configured to some extent with the Scrapy settings listed in Logging settings.

Scrapy calls scrapy.utils.log.configure_logging() to set some reasonable defaults and handle those settings in Logging settings when running commands, so it’s recommended to manually call it if you’re running Scrapy from scripts as described in Run Scrapy from a script.

Log levels

Python’s builtin logging defines 5 different levels to indicate the severity of a given log message. Here are the standard ones, listed in decreasing order:

  • logging.CRITICAL - for critical errors (highest severity)
  • logging.ERROR - for regular errors
  • logging.WARNING - for warning messages
  • logging.INFO - for informational messages
  • logging.DEBUG - for debugging messages (lowest severity)
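Each level also maps to an integer, and a logger only emits records at or above its configured level. As a minimal sketch (the logger name and a list-collecting handler are illustrative, not part of Scrapy), a logger set to logging.WARNING silently drops DEBUG and INFO messages:

```python
import logging

class ListHandler(logging.Handler):
    """Collect emitted messages in a list so the filtering is visible."""
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(record.getMessage())

logger = logging.getLogger("levels-demo")  # hypothetical logger name
logger.setLevel(logging.WARNING)           # drop anything below WARNING
handler = ListHandler()
logger.addHandler(handler)

logger.debug("ignored")     # below WARNING, filtered out
logger.info("ignored too")  # below WARNING, filtered out
logger.error("kept")        # ERROR >= WARNING, emitted
```

After running this, handler.records contains only the ERROR message, since the level check happens before any handler sees the record.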

How to log messages

Here’s a quick example of how to log a message using the logging.WARNING level:

import logging
logging.warning("This is a warning")

There are shortcuts for issuing log messages on any of the 5 standard levels, and there’s also a general logging.log method which takes a given level as an argument. If needed, the last example could be rewritten as:

import logging
logging.log(logging.WARNING, "This is a warning")

On top of that, you can create different “loggers” to encapsulate messages (for example, a common practice is to create a different logger for every module). These loggers can be configured independently, and they allow hierarchical constructions.
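That hierarchy is built from dotted logger names: a record logged on a child logger propagates up to handlers attached to its ancestors. A minimal sketch (the “app” and “app.db” names and the list-collecting handler are hypothetical):

```python
import logging

class ListHandler(logging.Handler):
    """Record (logger name, message) pairs that reach this handler."""
    def __init__(self):
        super().__init__()
        self.messages = []

    def emit(self, record):
        self.messages.append((record.name, record.getMessage()))

parent = logging.getLogger("app")    # hypothetical top-level logger
child = logging.getLogger("app.db")  # the dot makes it a child of "app"

handler = ListHandler()
parent.addHandler(handler)

# A message logged on the child propagates up to the parent's handler.
child.warning("connection slow")
```

The handler attached only to "app" still receives the record emitted on "app.db", which is what lets you configure a whole subtree of loggers in one place.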

The last examples use the root logger behind the scenes, which is a top-level logger to which all messages are propagated (unless otherwise specified). Using logging helpers is merely a shortcut for getting the root logger explicitly, so this is also an equivalent of the last snippets:

import logging
logger = logging.getLogger()
logger.warning("This is a warning")

You can use a different logger just by getting its name with the logging.getLogger function:

import logging
logger = logging.getLogger('mycustomlogger')
logger.warning("This is a warning")

Finally, you can ensure having a custom logger for any module you’re working on by using the __name__ variable, which is populated with the current module’s path:

import logging
logger = logging.getLogger(__name__)
logger.warning("This is a warning")

See also

Module logging, HowTo
Basic Logging Tutorial
Module logging, Loggers
Further documentation on loggers

Logging from Spiders

Scrapy provides a logger within each Spider instance, which can be accessed and used like this:

import scrapy

class MySpider(scrapy.Spider):

    name = 'myspider'
    start_urls = ['http://scrapinghub.com']

    def parse(self, response):
        self.logger.info('Parse function called on %s', response.url)

That logger is created using the Spider’s name, but you can use any customPython logger you want. For example:

import logging
import scrapy

logger = logging.getLogger('mycustomlogger')

class MySpider(scrapy.Spider):

    name = 'myspider'
    start_urls = ['http://scrapinghub.com']

    def parse(self, response):
        logger.info('Parse function called on %s', response.url)

Logging configuration

Loggers on their own don’t manage how messages sent through them are displayed. For this task, different “handlers” can be attached to any logger instance and they will redirect those messages to appropriate destinations, such as the standard output, files, emails, etc.
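To make that concrete, here is a minimal sketch using only the standard library (the logger name is hypothetical, and an in-memory stream stands in for stdout or a file):

```python
import io
import logging

logger = logging.getLogger("handler-demo")  # hypothetical logger name
logger.setLevel(logging.INFO)
logger.propagate = False  # keep the demo isolated from the root logger

# A StreamHandler routes records to a stream; a formatter controls
# how each record is rendered on the way out.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s: %(message)s"))
logger.addHandler(handler)

logger.info("item scraped")
print(stream.getvalue())  # INFO: item scraped
```

Swapping the StringIO for sys.stdout or using logging.FileHandler would change the destination without touching any of the logging calls themselves.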

By default, Scrapy sets and configures a handler for the root logger, based onthe settings below.

Logging settings

These settings can be used to configure the logging:

LOG_LEVEL determines the minimum level of severity to display; messages with lower severity will be filtered out. It ranges through the possible levels listed in Log levels.

LOG_FORMAT and LOG_DATEFORMAT specify formatting strings used as layouts for all messages. Those strings can contain any placeholders listed in logging’s logrecord attributes docs and datetime’s strftime and strptime directives respectively.
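As a sketch, a project’s settings.py could set these like so (the particular values are assumptions modeled on Scrapy’s documented defaults, shown only for illustration):

```python
# settings.py sketch -- values are illustrative, modeled on Scrapy's defaults
LOG_LEVEL = "INFO"  # anything below INFO is filtered out
LOG_FORMAT = "%(asctime)s [%(name)s] %(levelname)s: %(message)s"
LOG_DATEFORMAT = "%Y-%m-%d %H:%M:%S"
```

The %(...)s placeholders come from logging’s LogRecord attributes, while the LOG_DATEFORMAT string uses strftime directives to render %(asctime)s.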

Command-line options

There are command-line arguments, available for all commands, that you can useto override some of the Scrapy settings regarding logging.
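For example, Scrapy’s global options include --loglevel (-L), --logfile, and --nolog; the spider name below is hypothetical:

```shell
scrapy crawl myspider --loglevel=INFO    # override LOG_LEVEL for this run
scrapy crawl myspider --logfile=run.log  # send the log to a file
scrapy crawl myspider --nolog            # disable logging entirely
```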

See also

Module logging.handlers
Further documentation on available handlers

scrapy.utils.log module