
code's People

Contributors

dgadiraju, rajeshnutalapati


code's Issues

Setup Spark 1.2.1 on Quickstart VM


I am not seeing the symlink "hive-site.xml -> /etc/hive/conf/hive-site.xml".

Because of that, I am unable to follow the remaining setup steps.
Please help me resolve this issue.
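If the symlink is simply missing on the VM, it can usually be recreated by hand. A sketch, with the caveat that the Spark conf directory path below is an assumption and may differ on your VM:

```shell
# Recreate the missing hive-site.xml symlink in Spark's conf directory
# (the /etc/spark/conf path is an assumption -- adjust to your install).
sudo ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml

# Verify the link now points at the Hive config:
ls -l /etc/spark/conf/hive-site.xml
```

After that the remaining setup steps should be able to pick up the Hive configuration.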

Hi Durga, how can I resolve this error?

from pyspark.sql import SQLContext
sqlContext = SQLContext(sc)
jdbcurl = "jdbc:mysql://quickstart.cloudera:3306/retail_db?user=retail_dba&password=cloudera"
df = sqlContext.load(source="jdbc", url=jdbcurl, dbtable="departments")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'SQLContext' object has no attribute 'load'
@dgadiraju
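`SQLContext.load` was deprecated in Spark 1.4 and removed in later releases; its replacement is the `DataFrameReader` API (`sqlContext.read`). A sketch of the equivalent call, assuming an active `SparkContext` (`sc`) in the shell on the Quickstart VM:

```python
from pyspark.sql import SQLContext

sqlContext = SQLContext(sc)  # sc: the SparkContext already available in the shell
jdbcurl = "jdbc:mysql://quickstart.cloudera:3306/retail_db?user=retail_dba&password=cloudera"

# DataFrameReader replaces the removed SQLContext.load:
df = sqlContext.read.format("jdbc") \
    .option("url", jdbcurl) \
    .option("dbtable", "departments") \
    .load()
df.show()
```

This needs the MySQL JDBC driver on the Spark classpath, just as the old `load` call did.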

EDW PIG Issue

odistinct = DISTINCT orderstatus.order_status

The statement above will not work, because the DISTINCT operator operates on a relation, not on a field.
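The usual workaround is to project the field into a single-column relation first and then apply DISTINCT to that relation. A sketch, reusing the relation and field names from the statement above:

```pig
-- Project the field into its own single-column relation first...
ostatus = FOREACH orderstatus GENERATE order_status;
-- ...then DISTINCT works, because it now operates on a relation.
odistinct = DISTINCT ostatus;
```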

Performance improvements issue using CombineTextInputFormat with many small files

Hi dgadiraju,
I have tried your code for testing performance improvements with small files on my local machine.

Scenario: I have 100 small files, ~30 MB each.
Case 1: When the map-reduce job was executed over these 100 files with rowcount.java, 100 mappers completed the job in ~10 minutes.

Case 2: When the job was executed using RowCountCombinedFileInputFormat.java, 25 mappers were created and the job took ~33 minutes to complete.
I have not noticed any performance improvement using CombineTextInputFormat.
What might be the possible issues causing this?
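One knob that often explains this pattern: CombineTextInputFormat packs files into a split up to `mapreduce.input.fileinputformat.split.maxsize`, so if that cap is unset or very large, you get few mappers that each process a lot of input and run long. Capping the combined split at, say, one 128 MB block is worth trying. A sketch (the jar and class names are placeholders for your build, and passing `-D` this way assumes the driver uses ToolRunner):

```shell
# Cap the combined split size at 128 MB (134217728 bytes) so each mapper
# gets a bounded amount of input. Jar/class/paths are placeholders.
hadoop jar rowcount.jar RowCountCombinedFileInputFormat \
  -D mapreduce.input.fileinputformat.split.maxsize=134217728 \
  /input/smallfiles /output/rowcount
```

With ~3 GB of total input, a 128 MB cap should yield roughly 24 evenly sized mappers; if those still run slow, the bottleneck is elsewhere (e.g. a single local disk serving all mappers on one machine).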

Not able to do git clone, please help me out with this

Hi Durga,

I'm trying to do a git clone from Google Code the same way you showed in the video, but it's not working.

Kindly help me out with this. Please find below the error I get when running the command below; I am running it as root.

git clone http://code.google.com/p/parallel-ssh/

Error:
Initialized empty Git repository in /root/parallel-ssh/.git/
fatal: http://code.google.com/p/parallel-ssh//info/refs not found: did you run git update-server-info on the server?

Thanks in advance!

Thanks
Pawan Giri
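On the git clone failure above: Google Code was shut down in 2016, so `code.google.com` no longer serves git repositories, and the "info/refs not found" error comes from the server, not from your setup. Project sources were preserved in the Google Code Archive, and tools like pssh also live on GitHub mirrors and PyPI. A sketch of alternatives (the mirror URL and package name below are assumptions, verify them before relying on either):

```shell
# Option 1: clone a GitHub mirror of pssh (URL is an assumption -- verify it):
git clone https://github.com/lilydjwg/pssh.git

# Option 2: install the tool from PyPI instead of building from source
# (package name is an assumption -- check with "pip search"/PyPI first):
pip install pssh
```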

Need help in configuring ambari.repo

Hi Durga,

I tried to configure the "ambari.repo" file, and when I then tested it by running yum repolist, it threw the error shown below.
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile

https://access.redhat.com/articles/1320623

If above article doesn't help to resolve this issue please open a ticket with Red Hat Support.

http://hdpserver.techmahindra.com/yum/AMBARI-2.2.2.0/centos6/repodata/repomd.xml: [Errno 14] PYCURL ERROR 22 - "The requested URL returned error: 404 Not Found"
Trying other mirror.
repo id repo name status
HDP-2.3.4.0 HDP Version - HDP-2.3.4.0 175
HDP-UTILS-1.1.0.20 HDP Utils Version - HDP-UTILS-1.1.0.20 43
Updates-ambari-2.2.2.0 ambari-2.2.2.0 - Updates 0
base CentOS-6 - Base 6,696
extras CentOS-6 - Extras 62
updates CentOS-6 - Updates 281
repolist: 7,257
[root@hdpserver yum.repos.d]#

Please see below the repo file I have configured:

VERSION_NUMBER=2.2.2.0-460

[Updates-ambari-2.2.2.0]
name=ambari-2.2.2.0 - Updates
baseurl=http://hdpserver.techmahindra.com/yum/AMBARI-2.2.2.0/centos6
gpgcheck=1
gpgkey=http://hdpserver.techmahindra.com/yum/AMBARI-2.2.2.0/centos6/RPM-GPG-KEY/RPM-GPG-KEY=Jenkins
enabled=1
priority=1

On my system, the gpgkey is at the path shown above; I tried both, but I get the same error every time.

Kindly help me understand and resolve this issue.

Thanks
Pawan Giri
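The 404 on `repodata/repomd.xml` means yum found no repository metadata directly under the `baseurl`, so the first thing to verify is where `repodata/` actually sits on the mirror; on some Ambari mirrors it is nested one level deeper, under the build directory matching VERSION_NUMBER. Separately, the gpgkey filename `RPM-GPG-KEY=Jenkins` may be a typo for `RPM-GPG-KEY-Jenkins` (worth checking on the mirror), though that alone would not cause the repomd 404. A hedged check, reusing the URLs from the repo file above (the nested layout is an assumption):

```shell
# Does repodata live directly under baseurl?
curl -sI http://hdpserver.techmahindra.com/yum/AMBARI-2.2.2.0/centos6/repodata/repomd.xml | head -1

# Or is it nested under the build directory? (layout is an assumption)
curl -sI http://hdpserver.techmahindra.com/yum/AMBARI-2.2.2.0/centos6/2.2.2.0-460/repodata/repomd.xml | head -1

# Whichever URL returns "200 OK" is the correct baseurl; update ambari.repo
# accordingly, then clear the cached metadata and re-test:
yum clean all && yum repolist
```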

sqoop import for orders table

Hi Durga,
I am facing an error while trying to import the orders table from the retail_db database in MySQL.

[testsandbox@sandbox ~]$ sqoop import --connect jdbc:mysql://sandbox.hortonworks.com/retail_db \
--table orders \
--fields-terminated-by '^A' \
--lines-terminated-by '\n' \
--hive-home /user/hive/warehouse \
--hive-import \
--hive-table orders \
--hive-overwrite \
--username retail_dba \
--password hadoop
Warning: /usr/hdp/2.4.0.0-169/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/08/24 08:49:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.0.0-169
16/08/24 08:49:17 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/08/24 08:49:17 WARN sqoop.SqoopOptions: Character argument ^A has multiple characters; only the first will be used.
16/08/24 08:49:19 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/08/24 08:49:19 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.0.0-169/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.0.0-169/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/08/24 08:49:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
16/08/24 08:49:20 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@9fa6604 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@9fa6604 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
16/08/24 08:49:20 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
I got the same message when I used import-all.

Note that all other tables were imported successfully except orders.

Kindly suggest.

Thanks,
Rishit Shah
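The "Streaming result set ... is still active" failure above is a known incompatibility between Sqoop's MySQL streaming mode and older Connector/J drivers, which is why it fails on the metadata query before any columns can be generated. Two hedged workarounds (the driver version and jar path below are assumptions for an HDP 2.4 sandbox, check your install):

```shell
# Workaround 1: bypass MySQL streaming mode by forcing the generic JDBC
# manager -- add this option to the sqoop import command:
#   --driver com.mysql.jdbc.Driver
# (Sqoop will warn that it is falling back to generic handling; that is expected.)

# Workaround 2: upgrade the Connector/J jar the sandbox ships with
# (version and destination path are assumptions -- adjust to your environment):
cp mysql-connector-java-5.1.38-bin.jar /usr/hdp/current/sqoop-client/lib/
```

Either change affects only how Sqoop reads metadata from MySQL; the rest of the import command stays the same.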
