Archive for: November, 2015

Setup SQL Based Authorization in Hive

In this tutorial we will see how to set up SQL-based authorization in Hive.

 

Step 1 – Go to the Ambari UI and add/modify the below properties

 

Go to the Hive service → Configs and change Authorization to SQLStdAuth.

 

 

Step 2 – In hive-site.xml, make sure you have set the below properties:

 

hive.server2.enable.doAs --> false
hive.users.in.admin.role --> root (comma separated list of users)

 

Step 3 – Make sure that you have the below properties set in hiveserver2-site.xml (see the XML sketch after the list):

 

hive.security.authorization.manager --> org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory
hive.security.authorization.enabled --> true
hive.security.authenticator.manager --> org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator
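
If you are editing the configuration files by hand rather than through Ambari, these settings take the usual Hadoop XML form inside the <configuration> element. Below is a sketch of the hiveserver2-site.xml entries, with names and values copied from the list above; the hive-site.xml properties from Step 2 use the same <property> format.

<property>
    <name>hive.security.authorization.manager</name>
    <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory</value>
</property>
<property>
    <name>hive.security.authorization.enabled</name>
    <value>true</value>
</property>
<property>
    <name>hive.security.authenticator.manager</name>
    <value>org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator</value>
</property>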

 

Step 4 – Restart Hive services from the Ambari UI.

 

Step 5 – Test our setup. We will create one read-only user and try to drop a table.

 

5.1 Log in to beeline as root (as we have added root to hive.users.in.admin.role)

 

0: jdbc:hive2://localhost:10010> !connect jdbc:hive2://localhost:10010
Connecting to jdbc:hive2://localhost:10010
Enter username for jdbc:hive2://localhost:10010: root
Enter password for jdbc:hive2://localhost:10010: ****
Connected to: Apache Hive (version 1.2.1.2.3.2.0-2950)
Driver: Hive JDBC (version 1.2.1.2.3.2.0-2950)
Transaction isolation: TRANSACTION_REPEATABLE_READ
1: jdbc:hive2://localhost:10010>

 

5.2 By default there is only one current role – public. You need to run the command below to set your role to ADMIN.

 

0: jdbc:hive2://localhost:10010> SHOW CURRENT ROLES;
+---------+--+
|  role  |
+---------+--+
| public  |
+---------+--+

 

5.3 Set role as admin for user root

 

 

1: jdbc:hive2://localhost:10010> set role ADMIN;
No rows affected (0.445 seconds)
1: jdbc:hive2://localhost:10010> show roles;
+---------+--+
|  role  |
+---------+--+
| admin  |
| public  |
+---------+--+
2 rows selected (0.165 seconds)

 

5.4 Create a new role named readonly (we will add readonly_user to it in 5.6)

 

0: jdbc:hive2://slave1.hortonworks.com:10010/> create role readonly;
No rows affected (0.071 seconds)

 

5.5 Verify that the new role has been created successfully

 

0: jdbc:hive2://slave1.hortonworks.com:10010/> show roles;
+-----------+--+
|  role  |
+-----------+--+
| admin  |
| public  |
| readonly  |
+-----------+--+
3 rows selected (0.051 seconds)
0: jdbc:hive2://slave1.hortonworks.com:10010/>

 

5.6 Add readonly_user to the readonly role

 

5: jdbc:hive2://slave1.hortonworks.com:10010> grant role readonly to user readonly_user;
No rows affected (0.088 seconds)
5: jdbc:hive2://slave1.hortonworks.com:10010>

 

5.7 Grant select privileges to role readonly

 

5: jdbc:hive2://slave1.hortonworks.com:10010> grant select on table batting to role readonly;
No rows affected (0.405 seconds)
5: jdbc:hive2://slave1.hortonworks.com:10010>

 

5.8 Verify grants for role readonly

 

0: jdbc:hive2://slave1.hortonworks.com:10010/> show grant role readonly;
+-----------+----------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| database  |  table   | partition  | column  | principal_name  | principal_type  | privilege  | grant_option  |   grant_time   | grantor  |
+-----------+----------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| default   | batting  |            |         | readonly        | ROLE            | SELECT     | false         | 1447877696000  | root     |
+-----------+----------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
1 row selected (0.06 seconds)

 

5.9 Now log in to beeline as user readonly_user and try to drop the table batting

 

beeline> !connect jdbc:hive2://slave1.hortonworks.com:10010/
Connecting to jdbc:hive2://slave1.hortonworks.com:10010/
Enter username for jdbc:hive2://slave1.hortonworks.com:10010/: readonly_user
Enter password for jdbc:hive2://slave1.hortonworks.com:10010/: ********
Connected to: Apache Hive (version 1.2.1.2.3.2.0-2950)
Driver: Hive JDBC (version 1.2.1.2.3.2.0-2950)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://slave1.hortonworks.com:10010/> drop table batting;
Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: Principal [name=readonly_user, type=USER] does not have following privileges for operation DROPTABLE [[OBJECT OWNERSHIP] on Object [type=TABLE_OR_VIEW, name=default.batting]] (state=42000,code=40000)

 

Note – we are getting an error here because readonly_user does not have permission to drop table batting!

 

5.10 Let’s try to access some rows from table batting

 

0: jdbc:hive2://slave1.hortonworks.com:10010/> select * from batting limit 5;
+--------------------+---------------+---------------+--+
| batting.player_id  | batting.year  | batting.runs  |
+--------------------+---------------+---------------+--+
| playerID  | NULL  | NULL  |
| aardsda01  | 2004  | 0  |
| aardsda01  | 2006  | 0  |
| aardsda01  | 2007  | 0  |
| aardsda01  | 2008  | 0  |
+--------------------+---------------+---------------+--+
5 rows selected (0.775 seconds)
0: jdbc:hive2://slave1.hortonworks.com:10010/>

 

We can see that the grants are working: the user can read the table's contents but cannot drop the table.
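
For quick reference, here are the admin-side statements used in this test, exactly as run in the steps above:

set role ADMIN;
create role readonly;
grant role readonly to user readonly_user;
grant select on table batting to role readonly;
show grant role readonly;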

 

Please post comments if you need any help! :-)

 


Developing a custom Maven plugin using Java 5 annotations

Maven provides lots of built-in plugins, but at some point you may need a custom Maven plugin. Developing a custom Maven plugin using Java 5 annotations is simple and straightforward.

 


 

You just need to follow the steps below to develop a custom Maven plugin using Java 5 annotations:

 

Steps:

 

1. Create a new project with the POM packaging set to “maven-plugin”
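
As a reference point, a minimal plugin POM skeleton could look like the following; the groupId, artifactId, and version are placeholders, and the dependencies from step 2 go into its <dependencies> section:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <!-- placeholder coordinates; use your own -->
    <groupId>com.example.plugins</groupId>
    <artifactId>simple-maven-plugin</artifactId>
    <version>1.0-SNAPSHOT</version>

    <!-- plugin projects use the maven-plugin packaging -->
    <packaging>maven-plugin</packaging>

    <!-- dependencies from step 2 go here -->
    <dependencies>
    </dependencies>
</project>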

 

2. Add the below dependencies to your plugin POM:

 

i. The maven-plugin-api dependency provides the mojo API required by a custom Maven plugin.

<dependency>
            <groupId>org.apache.maven</groupId>
            <artifactId>maven-plugin-api</artifactId>
</dependency>

 

ii. Since Maven 3.0 we can use Java 5 annotations to develop a custom plugin. With annotations it is not necessary that the mojo super class be in the same project, as long as the super class also uses annotations. To use annotations in mojos, add the below dependency to your plugin POM file.

<dependency>
            <groupId>org.apache.maven.plugin-tools</groupId>
            <artifactId>maven-plugin-annotations</artifactId>
</dependency>

 

iii. The below dependency is used not only to read Maven project object model files, but also to assemble inheritance and to retrieve remote models as required.

<dependency>
            <groupId>org.apache.maven</groupId>
            <artifactId>maven-project</artifactId>
            <version>2.0.6</version>
</dependency>

 

iv. If you need any test or other third-party dependencies, add them as well.

 

3. The Maven plugin tools look for classes with the @Mojo annotation; any class annotated with @Mojo will be added to the plugin configuration file.

 

Eg:

 

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugin.MojoFailureException;
import org.apache.maven.plugins.annotations.Mojo;

@Mojo(name = "simplePlugin")
public class CustomMojo extends AbstractMojo {

    @Override
    public void execute() throws MojoExecutionException, MojoFailureException {
        getLog().info("Successfully created custom maven plugin");
        /*
         * your business logic goes here
         */
    }
}

 

The "name" parameter of the @Mojo annotation is your plugin's goal name; your plugin will be recognised by this name. Your mojo class extends the AbstractMojo class, which implements the Mojo interface and sets up logging for your plugin; the getLog() method provides info, error, debug, and warn levels of logging.

 

The Mojo interface has an execute method which contains the business logic of the plugin. The execute method throws 2 kinds of exceptions:

 

i. MojoFailureException: thrown when an expected problem occurs; throwing this exception causes a BUILD FAILURE message to be displayed and fails the build.

 

ii. MojoExecutionException: thrown when an unexpected problem occurs; throwing this exception causes a BUILD ERROR message to be displayed.

 

4. You can execute your plugin from the command line with the following command:

 

mvn pluginGroupId:artifactID:version:mojoName

To shorten the command, add the below lines to Maven's settings.xml file in the pluginGroups section. This tells Maven to search the repository for this groupId:

<pluginGroups>
    <pluginGroup>plugin group id</pluginGroup>
</pluginGroups>

After this you can run your plugin simply by providing the goal prefix and mojo name; the command will look like this:

mvn goalPrefix:mojoName
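
For example, if the plugin used the hypothetical coordinates com.example.plugins:simple-maven-plugin:1.0-SNAPSHOT and a goalPrefix of "simple" (both placeholders), the long and short forms of the command for the simplePlugin mojo from step 3 would be:

mvn com.example.plugins:simple-maven-plugin:1.0-SNAPSHOT:simplePlugin
mvn simple:simplePlugin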

 

5. Configuring the goalPrefix:

To create a goalPrefix, add the maven-plugin-plugin plugin to your plugin POM. It is used to create the plugin descriptor for any mojos found in the source tree and include it in the jar; it can also be used for generating report files for mojos and updating the plugin registry.

Eg:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-plugin-plugin</artifactId>
            <version>3.4</version>
            <configuration>
                <skipErrorNoDescriptorsFound>true</skipErrorNoDescriptorsFound>
                <goalPrefix>your goalPrefix</goalPrefix>
                <parameter1>custom param1</parameter1>
                <parameter2>custom param2</parameter2>
            </configuration>
            <executions>
                <execution>
                    <id>mojo-descriptor</id>
                    <phase>process-classes</phase>
                    <goals>
                        <goal>descriptor</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

 

6. You can pass external parameters to your plugin from the command line, and you can also set default values for your parameters if they are not passed from the command line.

 

Eg:

@Parameter(property = "param1", defaultValue = "abc")
private String type;

 

The command to run the plugin while passing the parameter is:

mvn goalPrefix:mojoName -Dparam1='acd'

 

If you set the required attribute of the parameter to false, then there is no compulsion to pass the parameter from the command line.
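
A sketch of such an optional parameter, reusing the param1 property from above:

@Parameter(property = "param1", required = false, defaultValue = "abc")
private String type;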

As we all know, Maven by default scans source files in src/main/java and test files in src/test; similarly, if you want your plugin to scan files in a particular folder of your project structure, you can do this by injecting an org.apache.maven.project.MavenProject parameter into your mojo.

Eg:

            @Parameter(defaultValue = "${project}", readonly = true, required = true)
            private MavenProject project;
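
With that parameter injected, execute() can resolve folders relative to the project root. A minimal sketch (assumes an import of java.io.File; the folder name src/main/custom is hypothetical):

@Override
public void execute() throws MojoExecutionException, MojoFailureException {
    // project.getBasedir() is the directory containing the project's pom.xml
    File customDir = new File(project.getBasedir(), "src/main/custom"); // hypothetical folder
    getLog().info("Scanning files under " + customDir.getAbsolutePath());
}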

 

You can also set default values for your parameters by adding a tag with the parameter's property name in the maven-plugin-plugin plugin's configuration section. If you don't want to set a default, keep these custom property fields blank; these property fields are the property values you declared in the mojo.

 

9. Using your plugin in the main project: to use your plugin in another project, add the plugin to the build section of that project.

Eg:

 

<build>
    <plugins>
        <plugin>
            <groupId>plugin's group id</groupId>
            <artifactId>plugin's artifact id</artifactId>
            <version>version of plugin</version>
        </plugin>
    </plugins>
</build>

 

10. To release the plugin, copy the plugin's jar and other dependent jars from your .m2 repository and release them to the client or QA.

 

Here we are done with developing a custom Maven plugin with Java 5 annotations. Please let me know if you have any doubts or suggestions. The sample project is on GitHub; you can download it from the link below:

 

https://github.com/omtonape/CustomPlugin-MavenJava5Annotations.git

 


Unable to delete STORM REST API service component after hdp upgrade

Unable to delete the STORM_REST_API service component after upgrading to HDP 2.2.0.0? Relax! You are in the right place; this guide will show you how to handle this kind of error.

 


 

Initially I had installed HDP 2.1 with Ambari 1.7, then I upgraded Ambari to 2.1.2 and upgraded the HDP stack to 2.2.0.0 as per this documentation.

 

As per the below note in the documentation, I had to delete the STORM_REST_API component:

 

“In HDP 2.2, STORM_REST_API component was deleted because the service was moved into STORM_UI_SERVER. When upgrading from HDP 2.1 to 2.2, you must delete this component using the API as follows”

 

I tried deleting it using Ambari REST APIs:

 

First, stop the component using an Ambari API call:

curl -u admin:admin -X PUT -H 'X-Requested-By:1' -d '{"RequestInfo":{"context":"Stop Component"},"Body":{"HostRoles":{"state":"INSTALLED"}}}' http://hdpambari.hortonworks.com:8080/api/v1/clusters/hdpambari/hosts/hdpambari.hortonworks.com/host_components/STORM_REST_API

 

Then delete it using the below curl call.

curl -u admin:admin -X DELETE -H 'X-Requested-By:1' http://hdpambari.hortonworks.com:8080/api/v1/clusters/hdpambari/services/STORM/components/STORM_REST_API

 

Please note, in the above commands:

admin:admin - my username and password for ambari UI
hdpambari.hortonworks.com:8080 - my hostname where ambari-server is installed and port number of ambari server
hdpambari - my cluster name

 

The above commands did not work because of the error given below:

"message" : "org.apache.ambari.server.controller.spi.SystemException: An internal system exception occurred: Could not delete service component from cluster. To remove service component, it must be in DISABLED/INIT/INSTALLED/INSTALL_FAILED/UNKNOWN/UNINSTALLED/INSTALLING state

 

The only option was to remove this component completely from the Ambari database and restart the ambari-server/agent processes.

 

Here is a short summary of the commands I ran:

[root@hdpambari ~]# psql ambari ambari
Password for user ambari:    #default password is "bigdata"
psql (8.4.20)
Type "help" for help.
ambari=> delete from hostcomponentstate where component_name='STORM_REST_API';
DELETE 1
ambari=> delete from hostcomponentdesiredstate where component_name='STORM_REST_API';
DELETE 1
ambari=> delete from servicecomponentdesiredstate where component_name='STORM_REST_API';
DELETE 1
ambari=> commit;
WARNING: there is no transaction in progress
COMMIT
ambari=> \q
[root@hdpambari ~]#
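
After restarting the ambari-server and ambari-agent processes, you can optionally verify the cleanup with a GET on the same REST endpoint used earlier; if the rows were removed, Ambari should report that the STORM_REST_API component no longer exists:

curl -u admin:admin -X GET http://hdpambari.hortonworks.com:8080/api/v1/clusters/hdpambari/services/STORM/components/STORM_REST_API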

 

This resolved my issue. Hope this helps!

 
