Minimum user id error while submitting mapreduce job
Hello everyone! Hope you are enjoying our blogs on crazyadmins.com
This is a short post to help you solve the "minimum user id" error that appears when submitting a MapReduce job in Hadoop. The full error looks like this:
Application application_XXXXXXXXX_XXXX failed 2 times due to AM Container for appattempt_XXXXXXXXX_XXXX_XXXXXX exited with exitCode: -1000
For more detailed output, check application tracking page: http://<your RM host>:8088/proxy/application_XXXXXXXXX_XXXX/ Then, click on links to logs of each attempt.
Diagnostics: Application application_XXXXXXXXX_XXXX initialization failed (exitCode=255) with output: Requested user hive is not whitelisted and has id 501, which is below the minimum allowed 1000
Failing this attempt. Failing the application.
This means there is a property (the minimum allowed UID for submitting jobs) that has been set to 1000. In the above example, the hive user has UID 501, which is below that minimum.
To solve this, we need to either:
- Update the UID of the hive user to a unique value greater than or equal to 1000, or
- Lower the property value to 500 so that the hive user's UID meets the minimum.
We will go with option 2 here and lower the property value to 500.
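Before changing anything, it is worth confirming the user's actual UID so you know which option applies. Here is a minimal sketch; the comparison mirrors the check YARN performs, and the user names, the example UID 1001, and the usermod step are illustrative only:

```shell
#!/bin/sh
# Report whether a user's UID satisfies YARN's minimum-UID check.
check_min_uid() {
  user="$1"; min="$2"
  uid=$(id -u "$user" 2>/dev/null) || { echo "no such user: $user"; return 1; }
  if [ "$uid" -lt "$min" ]; then
    echo "BLOCKED: $user has uid $uid, below minimum $min"
  else
    echo "OK: $user has uid $uid"
  fi
}

# Demo with root (uid 0); on your cluster you would check the hive user instead:
check_min_uid root 1000   # prints: BLOCKED: root has uid 0, below minimum 1000

# Option 1 would then be something like (run as root):
#   usermod -u 1001 hive
```

Note that option 1 has a side effect: any files owned by the old UID keep that numeric owner, so you would also need to chown them to the new UID. That is one reason lowering the property is often the simpler fix.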
If you are using Ambari to manage your Hortonworks cluster, then:
1. Log in to the Ambari UI with a user that has privileges to edit configurations.
2. Navigate to YARN configurations.
3. Go to Advanced yarn-env.sh
4. Update “Minimum user ID for submitting job” to 500
Resubmit your job now and it should just run fine!
Similarly, if you are using Cloudera Manager, find the equivalent property in the YARN configuration and update it there.
If you are not using either management UI, look for the property in the directory where your Hadoop configuration files are located, usually /etc/hadoop/conf. When the LinuxContainerExecutor is in use, the setting is min.user.id in container-executor.cfg.
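If you edit the file directly, the change looks like the fragment below. The path and the default of 1000 are the usual HDP layout, so adjust for your distribution, and make the same edit on every NodeManager host; restarting the NodeManagers is the safe way to ensure the new value takes effect.

```
# /etc/hadoop/conf/container-executor.cfg  (typical location; adjust to your layout)
# Lower the minimum allowed UID from the default 1000 to 500:
min.user.id=500
```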