11.6. HTTP/FTP Proxy

An HTTP/FTP proxy acts as an intermediary for HTTP and/or FTP connections. Its role is twofold:

  • Caching: recently downloaded documents are copied locally, so that later requests for the same document can be served without downloading it again.

  • Filtering server: if use of the proxy is mandated (and outgoing connections are blocked unless they go through the proxy), then the proxy can determine whether or not the request is to be granted.

Falcot Corp selected Squid as their proxy server.

11.6.1. Installing

The squid[3] Debian package only contains the modular (caching) proxy. Turning it into a filtering server requires installing the additional squidguard package. In addition, squid-cgi provides a querying and administration interface for a Squid proxy.

Prior to installing, care should be taken to check that the system can identify its own complete name: the hostname -f command must return a fully-qualified name (including a domain). If it does not, then the /etc/hosts file should be edited so that it contains the full name of the system (for instance, arrakis.falcot.com). The official computer name should be validated with the network administrator in order to avoid potential name conflicts.
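This check can be scripted along the following lines; the IP address and names in the suggested /etc/hosts line are only placeholders to be adapted to the actual machine:

```shell
# Check the name this system will advertise: it must be fully
# qualified (e.g. arrakis.falcot.com), not just a short host name.
fqdn=$(hostname -f 2>/dev/null || hostname)
echo "$fqdn"
case "$fqdn" in
  *.*) echo "OK: name is fully qualified" ;;
  *)   echo "Short name only; add the FQDN to /etc/hosts, e.g.:"
       echo "  192.168.1.2  arrakis.falcot.com  arrakis" ;;
esac
```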

11.6.2. Configuring a Cache

Enabling the caching server feature is a simple matter of editing the /etc/squid/squid.conf configuration file and allowing machines from the local network to run queries through the proxy. The following example shows the modifications made by the Falcot Corp administrators:

Example 11.22. The /etc/squid/squid.conf file (excerpts)

  # INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
  #
  include /etc/squid/conf.d/*

  # Example rule allowing access from your local networks.
  # Adapt localnet in the ACL section to list your (internal) IP networks
  # from where browsing should be allowed

  acl our_networks src 192.168.1.0/24 192.168.2.0/24
  http_access allow our_networks
  http_access allow localhost
  # And finally deny all other access to this proxy
  http_access deny all

11.6.3. Configuring a Filter

Squid itself does not perform the filtering; this task is delegated to squidGuard. Squid must therefore be configured to interact with it, which involves adding the following directive to the /etc/squid/squid.conf file:

  url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

The /usr/lib/cgi-bin/squidGuard.cgi CGI program also needs to be installed, using /usr/share/doc/squidguard/examples/squidGuard.cgi.gz as a starting point. Required modifications to this script are the $proxy and $proxymaster variables (the name of the proxy and the administrator’s contact email, respectively). The $image and $redirect variables should point to existing images representing the rejection of a query.
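The installation of the CGI script can be sketched as follows; the paths are those of the Debian squidguard and squid-cgi packages, and the copy is skipped gracefully when the example file is not present:

```shell
# Unpack the example CGI script shipped with squidguard into the
# CGI directory, then make it executable (Debian package paths).
src=/usr/share/doc/squidguard/examples/squidGuard.cgi.gz
dst=/usr/lib/cgi-bin/squidGuard.cgi
if [ -f "$src" ]; then
    zcat "$src" > "$dst"
    chmod 755 "$dst"
    echo "installed: $dst (now edit \$proxy, \$proxymaster, \$image and \$redirect)"
else
    echo "example script not found: $src (is squidguard installed?)"
fi
```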

The filter is enabled with the service squid reload command. However, since the squidguard package does no filtering by default, it is the administrator’s task to define the policy. This can be done by creating the /etc/squid/squidGuard.conf file (using /etc/squidguard/squidGuard.conf.default as template if required).
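As a starting point, a minimal policy might look like the following sketch; the adult category name and the redirect URL are assumptions to be adapted to the blacklists actually deployed, while the database and log paths match the Debian package defaults:

```
# /etc/squid/squidGuard.conf -- minimal, hypothetical policy
dbhome /var/lib/squidguard/db
logdir /var/log/squidguard

# One blacklist category, stored as plain-text lists under dbhome
dest adult {
        domainlist adult/domains
        urllist    adult/urls
}

acl {
        default {
                # Allow everything except the "adult" category
                pass !adult all
                # Rejected queries are sent to the explanation CGI
                redirect http://proxy.falcot.com/cgi-bin/squidGuard.cgi?clientaddr=%a&url=%u
        }
}
```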

The working database must be regenerated with update-squidguard after each change of the squidGuard configuration file (or one of the lists of domains or URLs it mentions). The configuration file syntax is documented on the following website:

http://www.squidguard.org/Doc/configure.html

ALTERNATIVE E2guardian (a DansGuardian Fork)

The e2guardian package, a DansGuardian fork, is an alternative to squidguard. This software does not simply handle a blacklist of forbidden URLs: it can also take advantage of PICS[4] (Platform for Internet Content Selection) ratings to decide whether a page is acceptable by dynamic analysis of its contents.


[3] The squid3 package, providing Squid until Debian Jessie, is now a transitional package and will automatically install squid.

[4] PICS has been superseded by the Protocol for Web Description Resources (POWDER): https://www.w3.org/2009/08/pics_superseded.html.