hawq check
Verifies and validates HAWQ platform settings.
Synopsis
hawq check -f <hostfile_hawq_check> | (-h <hostname> | --host <hostname>)
[--hadoop <hadoop_home> | --hadoop-home <hadoop_home>]
[--config <config_file>]
[--stdout | --zipout]
[--kerberos]
[--hdfs-ha]
[--yarn]
[--yarn-ha]
hawq check --zipin <hawq_check_zipfile>
hawq check --version
hawq check -?
Description
The hawq check utility determines the platform on which you are running HAWQ and validates various platform-specific configuration settings, as well as HAWQ- and HDFS-specific configuration settings. To perform HAWQ configuration checks, make sure HAWQ has already been started and that hawq config works. For HDFS checks, either set the $HADOOP_HOME environment variable or provide the full path to the Hadoop installation location using the --hadoop option.
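For example, you could set $HADOOP_HOME before running the HDFS checks; the installation path below is illustrative only and should match your actual Hadoop installation:
$ export HADOOP_HOME=/usr/hdp/<version>/hadoop
$ hawq check -f hostfile_hawq_check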
The hawq check utility can use a host file or a file previously created with the --zipout option to validate platform settings. If GPCHECK_ERROR is displayed, one or more validation checks failed. You can also use hawq check to gather and view platform settings on hosts without running validation checks. When running checks, hawq check compares your actual configuration settings with the expected values listed in a configuration file ($GPHOME/etc/hawq_check.cnf by default). You must modify the “mount.points” and “diskusage.monitor.mounts” values in that file to reflect the actual mount points you want to check, specified as a comma-separated list. Otherwise, the utility only checks the root directory, which may not be helpful.
An example is shown below:
[linux.mount]
mount.points = /,/data1,/data2
[linux.diskusage]
diskusage.monitor.mounts = /,/data1,/data2
Arguments
-f <hostfile_hawq_check>
The name of a file that contains a list of hosts that hawq check uses to validate platform-specific settings. This file should contain the host name of every host in your HAWQ system (master, standby master, and segments), one host name per line, as shown in the sample below.
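A sample host file follows; the host names shown are illustrative only and should be replaced with the host names in your cluster:
mdw
smdw
sdw1
sdw2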
-h, --host <hostname>
Specifies a single host on which platform-specific settings will be validated.
--zipin <hawq_check_zipfile>
Use this option to decompress and check a .zip file created with the --zipout option. If you specify the --zipin option, hawq check performs validation tasks against the specified file.
Options
--config <config_file>
The name of a configuration file to use instead of the default file $GPHOME/etc/hawq_check.cnf.
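For example, to run checks against a customized copy of the default configuration file (the file path below is illustrative only):
$ hawq check -f hostfile_hawq_check --config /home/gpadmin/my_hawq_check.cnf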
--hadoop, --hadoop-home <hadoop_home>
Use this option to specify the full path to your Hadoop installation location so that hawq check can validate HDFS settings. This option is not needed if the $HADOOP_HOME environment variable is set.
--stdout
Send collected host information from hawq check to standard output. No checks or validations are performed.
--zipout
Save all collected data to a .zip file in the current working directory. hawq check automatically creates the .zip file and names it hawq_check_timestamp.tar.gz. No checks or validations are performed.
--kerberos
Use this option to check HDFS and YARN when running in Kerberos mode. This allows hawq check to validate HAWQ/HDFS/YARN settings with Kerberos enabled.
--hdfs-ha
Use this option to indicate that HDFS HA mode is enabled, allowing hawq check to validate HDFS settings with HA mode enabled.
--yarn
If HAWQ is using YARN, this option enables YARN mode, allowing hawq check to validate the basic YARN settings.
--yarn-ha
Use this option to indicate that HAWQ is using YARN with High Availability mode enabled, allowing hawq check to validate HAWQ-YARN settings with YARN HA enabled.
--version
Displays the version of this utility.
-? (help)
Displays the online help.
Examples
Verify and validate the HAWQ platform settings using a host file and specifying the full Hadoop installation path:
$ hawq check -f hostfile_hawq_check --hadoop /usr/hdp/<version>/hadoop
Verify and validate the HAWQ platform settings with HDFS HA, YARN HA, and Kerberos enabled:
$ hawq check -f hostfile_hawq_check --hadoop /usr/local/hadoop-<version> --hdfs-ha --yarn-ha --kerberos
Verify and validate the HAWQ platform settings with HDFS HA and Kerberos enabled:
$ hawq check -f hostfile_hawq_check --hadoop /usr/hdp/<version>/hadoop --hdfs-ha --kerberos
Save HAWQ platform settings to a zip file when the $HADOOP_HOME environment variable is set:
$ hawq check -f hostfile_hawq_check --zipout
Verify and validate the HAWQ platform settings using a zip file created with the --zipout option:
$ hawq check --zipin hawq_check_timestamp.tar.gz
View collected HAWQ platform settings:
$ hawq check -f hostfile_hawq_check --hadoop /usr/local/hadoop-<version> --stdout