Once we get the data back from the Hive query, the result set is relatively small, so let's dump it out to HBase. Drag in a tHBaseOutput component and configure the ZooKeeper connection info. That is all the client needs to know about HBase: from ZooKeeper it figures out which HMaster is active and which region servers serve the storage.
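Under the hood, the component only needs the ZooKeeper quorum and client port. The equivalent client-side `hbase-site.xml` settings look like this (the host names below are placeholders, not from the original job):

```xml
<!-- Client-side HBase config: point at the ZooKeeper ensemble -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>zk1.example.com,zk2.example.com,zk3.example.com</value>
</property>
<property>
  <name>hbase.zookeeper.property.clientPort</name>
  <value>2181</value>
</property>
```

Everything else (active HMaster, region locations) is discovered through ZooKeeper at runtime.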
Here the query result is mapped to an HBase table: we generate one random row key, and both columns are mapped into the "profile" column family.
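The mapping is simple enough to sketch in plain Java. This is an illustration of what the job does, not the code Talend generates; the column names `name` and `email` are made up for the example:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.UUID;

public class ProfileRowSketch {

    // Build the cells for one row: every column lands in the "profile"
    // column family, addressed as "family:qualifier".
    static Map<String, String> toCells(String name, String email) {
        Map<String, String> cells = new LinkedHashMap<>();
        cells.put("profile:name", name);
        cells.put("profile:email", email);
        return cells;
    }

    // A random, unique row key, since the query result has no natural key.
    static String randomRowKey() {
        return UUID.randomUUID().toString();
    }

    public static void main(String[] args) {
        String rowKey = randomRowKey();
        Map<String, String> cells = toCells("alice", "alice@example.com");
        System.out.println(rowKey + " -> " + cells);
    }
}
```

In the real job, each `profile:qualifier`/value pair becomes one HBase cell under the generated row key.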
Click the Advanced settings tab to add this column family to the schema metadata.
Seven rows are inserted into HBase, and the table itself was created by Talend as well. The data is there!
So we have built a simple big data ETL job: load data from a database into HDFS, run a Hive query to do the analysis, and push the result to HBase for the application to query.
Hope this helps.
2 comments:
Can we use a column for defining the custom key?
(String)globalMap.get("row2.memberid")
Hi, thanks for the post. I am facing a weird issue with Kerberos authentication: I need to pass a keytab, but I don't know where to get the keytab from.
Caused by: java.io.IOException: Login failure for abc@abc.COM from keytab /etc/security/keytabs/abc.service.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user.
Really appreciate your help in advance.