JDBC Hive

Introduction to JDBC Hive

The Hive JDBC driver allows the user to manage data in Hadoop from business intelligence applications through JDBC support. The driver does this by converting calls from the application into SQL and transmitting the SQL query to the underlying Hive engine. Developers mostly use the Hive JDBC driver to build desktop, mobile, and web applications that work with live data in Hive. The Hive JDBC driver has an architecture similar to MySQL and OLE DB drivers; its Connection, Statement, and ResultSet objects behave in the same way. Hive JDBC is briefly explained in this article.

What is JDBC Hive?

The JDBC driver is one of the components of the Hive client, along with the ODBC driver and the Thrift server. The JDBC driver establishes the connection between a Java application and Hive, while the ODBC driver lets applications connect to Hive over the ODBC protocol. Hive's features include handling files in multiple data formats, SQL-like data access, access to files stored in HBase and HDFS, query execution via Tez or MapReduce, support for multiple languages, and low-latency query retrieval via LLAP. Hive also offers a driver and a command-line tool for data operations.

How to use JDBC Hive?

The way Hive works is simple. On the client side, the application and the Hive JDBC driver connect to the Hive server in the Hive services layer. The Hive server in turn connects to a common driver that can access all the file systems in the Hadoop cluster, as well as the Hive metastore database, which resides in the Hive storage and compute layer.

Connection URL for JDBC Hive

Hive supports connections using URL strings, just like any other database.

jdbc:hive2://<ip-address>:<port>    to link remote applications to Hive

The remote applications can be written in Java, Scala, Spark, or Python.

In previous versions (HiveServer1), the syntax was jdbc:hive://

The driver is installed as JAR files, which are required to manage Hive through JDBC.

The required JAR files follow the naming pattern below:

hive-jdbc-<version>.jar or hive-service-<version>.jar

When the Hive server is configured, the user must supply the JDBC driver class name, the database URL, and the client credentials. Every one of these components must be specified to establish a connection to the SQL database.
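
A hedged sketch in Java of supplying these three components (the driver class shown is the HiveServer2 driver; the host, database, and credentials are placeholder values, not taken from this article):

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveConnectionSetup {
    public static void main(String[] args) throws Exception {
        String driverClass = "org.apache.hive.jdbc.HiveDriver";           // JDBC driver class name
        String url = "jdbc:hive2://hive-host.example.com:10000/default";  // database URL (placeholder host)
        String user = "educba";                                           // client credentials (placeholders)
        String password = "";

        Class.forName(driverClass);                                        // register the driver
        Connection con = DriverManager.getConnection(url, user, password); // establish the connection
        System.out.println("Connection open: " + !con.isClosed());
        con.close();
    }
}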

To access Hive via JDBC, the server configuration should be changed in the file below, which holds the JDBC connection URL and the driver class name:

jdbc-site.xml
jdbc:hive2://<hiveserver_hostname>:<hiveserver_port>/<databasename>
org.apache.hive.jdbc.HiveDriver

The HiveServer2 authentication mode is set in the hive.server2.authentication property, and impersonation is controlled by the hive.server2.enable.doAs property.

Whether or not the Hive service uses Kerberos authentication determines how the other server properties must be configured. These properties are defined in the above-mentioned XML configuration file in Hadoop, and the user can change them by editing this file.

Use the user identity option to see who is accessing data on the given server.

HiveServer2

To enable remote access from Python, Scala, Java, or any other programming language, make sure the HiveServer2 service is running.

It is located in the directory $HIVE_HOME/bin:

educba@name:~/hive/bin$ ./hiveserver2

HiveServer2 is then started.

To connect Java or Scala to Hive and execute HiveQL, add the Hive JDBC artifact (org.apache.hive:hive-jdbc) from the Maven repository as a dependency, using either Gradle or Maven. With Maven, the artifact is declared in the pom.xml file. The artifact version and the Hive version must match to avoid errors.

The new driver class is org.apache.hive.jdbc.HiveDriver

This works with HiveServer2.

If the user is on a previous version, he can choose to work with

org.apache.hadoop.hive.jdbc.HiveDriver

The connection string should then be jdbc:hive://

Connecting to Hive from Java

The basic calls to access Hive from Java are listed below. They connect to the default database in Hive.

To load the Hive JDBC driver, use

Class.forName("org.apache.hive.jdbc.HiveDriver")

To open the connection, use

DriverManager.getConnection()

To get the Statement object, use

connection.createStatement()

To execute a query, use

stmt.executeQuery("<query string>")

The connection URL passed to getConnection() takes the form

jdbc:hive2://<ip-address>:10000/default
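
Putting these calls together, the following is a minimal sketch of a complete Java client; the host, credentials, and table name are placeholders, and the table is assumed to have an int key column and a string value column:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        // Load the HiveServer2 JDBC driver.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Open a connection to the default database (host and credentials are placeholders).
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://192.168.1.10:10000/default", "educba", "");
        Statement stmt = con.createStatement();
        // Run a query and print every row; assumes a (key int, value string) table.
        ResultSet res = stmt.executeQuery("select * from educba_hivedriver_table");
        while (res.next()) {
            System.out.println(res.getInt(1) + "\t" + res.getString(2));
        }
        con.close();
    }
}

The hive-jdbc JAR (and its dependencies) must be on the classpath when compiling and running this class.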

Hive from Scala:

To access Hive from Scala, import the required java.sql packages (Connection, DriverManager, ResultSet, SQLException, Statement) and use the same JDBC calls as in Java:

import java.sql.{Connection, DriverManager, ResultSet, SQLException, Statement}

object HiveJdbcClient extends App {
  // Previous-version (HiveServer1) driver, as used in this article;
  // for HiveServer2 use org.apache.hive.jdbc.HiveDriver and a jdbc:hive2:// URL.
  val driverName = "org.apache.hadoop.hive.jdbc.HiveDriver"
  Class.forName(driverName)

  // Replace the host with the Hive server address; 10000 is the default port.
  val connection = DriverManager.getConnection("jdbc:hive://192.168.1.1:10000/default", "", "")
  val stmt = connection.createStatement()
  val tableName = "educba_hivedriver_table"

  // Recreate the example table.
  stmt.execute("drop table if exists " + tableName)
  stmt.execute("create table " + tableName + " (key int, value string)")

  // select * query
  val sql = "select * from " + tableName
  var res = stmt.executeQuery(sql)
  while (res.next()) {
    println(res.getInt(1) + "\t" + res.getString(2))
  }

  // standard Hive query: count the rows
  res = stmt.executeQuery("select count(1) from " + tableName)
  while (res.next()) {
    println(res.getString(1))
  }

  connection.close()
}

JDBC Hive examples

Hive has major components such as WebHCat and HCatalog. HCatalog stores data in Hadoop and enables data-processing capabilities for tools such as Pig and MapReduce. WebHCat can be used to run MapReduce, Hive, and Pig jobs; it can also manage metadata-store operations through a REST API and handle data-type conversions. The user can use the connector to fetch data from Hive and can submit a customized SQL query through JDBC to fetch results with the help of the connector, as in the sketch below.
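
A hedged sketch of submitting a customized SQL query through JDBC (the table and column names are illustrative placeholders; parameter binding through PreparedStatement is assumed to be available, as it is with the HiveServer2 driver):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class HiveCustomQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://192.168.1.10:10000/default", "educba", "");
        // A customized query with a bound parameter; table and column names are placeholders.
        PreparedStatement ps = con.prepareStatement(
                "select key, value from educba_hivedriver_table where key > ?");
        ps.setInt(1, 10);
        ResultSet res = ps.executeQuery();
        while (res.next()) {
            System.out.println(res.getInt(1) + "\t" + res.getString(2));
        }
        con.close();
    }
}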

When NOSASL authentication is used, the required configuration is jdbc.property.authentication = NOSASL.

If a user name must be provided, it can be set with the jdbc.user property.

When Kerberos authentication is used, there are several additional configuration steps, and they can be adjusted as required. To connect to a secured Hive cluster, the user should add the directory containing hive-site.xml to the client classpath.
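
A hedged sketch of what a Kerberos-secured connection can look like (the host and Kerberos principal are placeholders; it assumes a valid Kerberos ticket, for example from kinit, and the directory containing hive-site.xml on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveKerberosConnect {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // The HiveServer2 principal is appended to the URL; host and realm are placeholders.
        String url = "jdbc:hive2://hive-host.example.com:10000/default;"
                   + "principal=hive/_HOST@EXAMPLE.COM";
        // No user name or password is passed; the Kerberos ticket identifies the client.
        Connection con = DriverManager.getConnection(url);
        System.out.println("Connected: " + !con.isClosed());
        con.close();
    }
}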

Conclusion

The configuration can be changed in its XML file. JDBC Hive is used in many different scenarios and can be implemented according to the requirement.

Recommended Articles

This is a guide to JDBC Hive. Here we discuss how to use JDBC Hive, along with examples and how to connect to Hive from Java. You may also have a look at the following articles to learn more –

  1. Hive JDBC Driver
  2. JDBC Driver for Oracle
  3. JDBC Driver
  4. What is JDBC?
