View Results
Last Updated: 2020-11-11
- In "Product Service > MapReduce > Baidu MapReduce - Job List", click a job name to view the job's details.
- Hadoop divides a job into multiple tasks for processing; the task types are map tasks and reduce tasks. Click the drop-down arrow to view each task's processing status.
- In case of failure, each task is attempted up to four times, and information on every attempt is recorded under "View Attempt". Click the "View Attempt" entry corresponding to "MAP" or "REDUCE" to view the attempt details.
- Click "Job Log" to view the job logs (Spark jobs have no logs). Three logs are shown: syslog, stderr, and stdout.
  - syslog records details of the job run;
  - stderr records errors during the job run;
  - stdout records the output after the job completes.
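The map/reduce task split described above can be illustrated with a minimal local word-count sketch in plain Python. This is only a conceptual illustration, not the BMR API: names like `map_task` and `run_job` are hypothetical, and in a real Hadoop job each map and reduce task runs distributed across cluster nodes.

```python
from itertools import groupby
from operator import itemgetter

def map_task(line):
    """Map task: emit a (word, 1) pair for each word in one input line."""
    return [(word, 1) for word in line.split()]

def reduce_task(word, counts):
    """Reduce task: sum all counts emitted for one word."""
    return (word, sum(counts))

def run_job(lines):
    # "Map phase": every input line goes through a map task.
    pairs = []
    for line in lines:
        pairs.extend(map_task(line))
    # "Shuffle": group the intermediate pairs by key.
    pairs.sort(key=itemgetter(0))
    # "Reduce phase": one reduce call per distinct key.
    return dict(
        reduce_task(word, (count for _, count in group))
        for word, group in groupby(pairs, key=itemgetter(0))
    )

print(run_job(["hello world", "hello mapreduce"]))
# → {'hello': 2, 'mapreduce': 1, 'world': 1}
```

In the console view, each call to `map_task` and `reduce_task` corresponds to one task row, and a failed row would be retried and recorded as an additional attempt.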