Apache Pig - Diagnostic Operators
The load statement simply loads the data into the specified relation in Apache Pig. To verify the execution of the LOAD statement, you have to use the diagnostic operators. Pig Latin provides four different types of diagnostic operators (a brief invocation sketch follows this list):
- Dump operator
- Describe operator
- Explain operator
- Illustrate operator
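All four operators are entered at the Grunt shell prompt and act on a relation. Below is a minimal sketch of how each is typically invoked, assuming a relation named student has already been defined (Describe, Explain, and Illustrate are covered in the following chapters):
grunt> Dump student;        -- runs the statements and prints the contents of the relation
grunt> Describe student;    -- prints the schema of the relation
grunt> Explain student;     -- prints the logical, physical, and MapReduce execution plans
grunt> Illustrate student;  -- shows a step-by-step sample execution of the statements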
In this chapter, we will discuss the Dump operator of Pig Latin.
Dump Operator
The Dump operator is used to run Pig Latin statements and display the results on the screen. It is generally used for debugging purposes.
Syntax
Given below is the syntax of the Dump operator.
grunt> Dump Relation_Name;
Example
Assume we have a file named student_data.txt in HDFS with the following content.
001,Rajiv,Reddy,9848022337,Hyderabad
002,siddarth,Battacharya,9848022338,Kolkata
003,Rajesh,Khanna,9848022339,Delhi
004,Preethi,Agarwal,9848022330,Pune
005,Trupthi,Mohanthy,9848022336,Bhuwaneshwar
006,Archana,Mishra,9848022335,Chennai
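If the file exists only on the local file system, it can be copied into HDFS directly from the Grunt shell before loading. A minimal sketch, assuming student_data.txt is in the directory from which Grunt was started and the NameNode is reachable at 127.0.0.1:9000 (the fs command passes its arguments to the Hadoop file-system shell):
grunt> fs -mkdir hdfs://127.0.0.1:9000/pig_data
grunt> fs -put student_data.txt hdfs://127.0.0.1:9000/pig_data/
grunt> fs -cat hdfs://127.0.0.1:9000/pig_data/student_data.txt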
We have read it into a relation named student using the LOAD operator, as shown below.
grunt> student = LOAD 'hdfs://127.0.0.1:9000/pig_data/student_data.txt' USING PigStorage(',')
   as (id:int, firstname:chararray, lastname:chararray, phone:chararray, city:chararray);
Now, let us print the contents of the relation using the Dump operator as shown below.
grunt> Dump student;
Once you execute the above Pig Latin statement, it starts a MapReduce job to read the data from HDFS and produces the following output.
2015-10-01 15:05:27,642 [main] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2015-10-01 15:05:27,652 [main] INFO  org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:

HadoopVersion  PigVersion  UserId  StartedAt            FinishedAt        Features
2.6.0          0.15.0      Hadoop  2015-10-01 15:03:11  2015-10-01 05:27  UNKNOWN

Success!

Job Stats (time in seconds):
JobId             job_14459_0004
Maps              1
Reduces           0
MaxMapTime        n/a
MinMapTime        n/a
AvgMapTime        n/a
MedianMapTime     n/a
MaxReduceTime     0
MinReduceTime     0
AvgReduceTime     0
MedianReducetime  0
Alias             student
Feature           MAP_ONLY
Outputs           hdfs://127.0.0.1:9000/tmp/temp580182027/tmp757878456,

Input(s): Successfully read 0 records from: "hdfs://127.0.0.1:9000/pig_data/student_data.txt"

Output(s): Successfully stored 0 records in: "hdfs://127.0.0.1:9000/tmp/temp580182027/tmp757878456"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG: job_1443519499159_0004

2015-10-01 15:06:28,403 [main] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Success!
2015-10-01 15:06:28,441 [main] INFO  org.apache.pig.data.SchemaTupleBackend - Key [pig.schematuple] was not set... will not generate code.
2015-10-01 15:06:28,485 [main] INFO  org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2015-10-01 15:06:28,485 [main] INFO  org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1

(1,Rajiv,Reddy,9848022337,Hyderabad)
(2,siddarth,Battacharya,9848022338,Kolkata)
(3,Rajesh,Khanna,9848022339,Delhi)
(4,Preethi,Agarwal,9848022330,Pune)
(5,Trupthi,Mohanthy,9848022336,Bhuwaneshwar)
(6,Archana,Mishra,9848022335,Chennai)
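Because Dump launches a full job and prints every record, it can be slow and produce a lot of output on large relations. A common debugging pattern, sketched below with a hypothetical relation name student_sample, is to apply LIMIT before dumping:
grunt> student_sample = LIMIT student 3;
grunt> Dump student_sample;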