Spark

Spark jobs detail with UI visualization.
Hope you are doing well. It means that the job had 4 stages; that is, to complete, the task travelled through 4 stages. Please kind...
Sat, 2 Apr, 2016 at 11:25 AM
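The stage breakdown described in that reply can be reproduced with a small job: each shuffle operation (e.g. reduceByKey, repartition) closes one stage and opens the next, and the Spark UI's DAG visualization shows exactly those boundaries. A minimal sketch, assuming a local Spark setup as used in the course; the input and output paths are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: a job whose DAG splits into multiple stages.
// Each shuffle (reduceByKey, repartition) is a stage boundary in the UI.
object StagesDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("StagesDemo").setMaster("local[2]"))
    val counts = sc.textFile("input.txt")   // placeholder path; stage 1: read + map side
      .flatMap(_.split("\\s+"))
      .map(w => (w, 1))
      .reduceByKey(_ + _)                   // shuffle -> new stage
      .repartition(4)                       // shuffle -> another stage
      .map { case (w, n) => s"$w\t$n" }
    counts.saveAsTextFile("word-counts")    // action: triggers the whole job
    sc.stop()
  }
}
```

Open http://localhost:4040 while the job runs to see the stages in the UI's DAG view.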
Issue in VM - Apache Spark and Scala
Dear Herin, Hope you are doing great! This is just a warning that you are getting because of missing packages. You can ignore this and ...
Sat, 2 Apr, 2016 at 11:24 AM
Spark installation on Windows.
  Hope you are doing well. Please kindly check the attachment for the guide document. I hope this will help you. If you face any iss...
Sat, 26 Mar, 2016 at 2:31 PM
Spark sbt Eclipse (how to run a Spark program in Eclipse?).
Hope you are doing well. Please kindly check the attachment for the guide document. Try to follow the steps; if you face any further issue, kindl...
Sat, 26 Mar, 2016 at 2:34 PM
Realtek PCIe FE Family Controller.
Hope you are doing well. Click on "change network settings", and then in the network settings set the first adapter to NAT and the other one to HO...
Sat, 16 Apr, 2016 at 1:19 PM
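The adapter setup that reply begins to describe (first adapter NAT, second one host-only) can also be done from the command line instead of the VirtualBox GUI. A hedged sketch; the VM name and host-only interface name are examples and will differ on your machine:

```shell
# Sketch: configure the course VM's network adapters with VBoxManage.
# "Edureka-VM" and "vboxnet0" are placeholder names.
VBoxManage modifyvm "Edureka-VM" --nic1 nat           # adapter 1: NAT (internet access)
VBoxManage modifyvm "Edureka-VM" --nic2 hostonly \
  --hostonlyadapter2 "vboxnet0"                       # adapter 2: host-only (host <-> guest)
```

Run the commands while the VM is powered off, then restart it for the adapters to take effect.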
Spark program to replace MapReduce
Hello Sunil, Hope you are doing well. Map in Hadoop: public class LineLengthMapper extends Mapper&lt;LongWritable,Text,In...
Sat, 2 Apr, 2016 at 11:23 AM
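Only the start of the Hadoop LineLengthMapper survives in the preview, but it evidently emits per-line lengths; in Spark the whole Mapper + Reducer pair collapses to a few RDD operations. A hedged sketch of an equivalent job (assuming the original counted lines by length; the paths are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of a Spark job replacing a Hadoop LineLengthMapper/Reducer pair:
// count how many input lines have each length.
object LineLengths {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("LineLengths").setMaster("local[2]"))
    val lengthCounts = sc.textFile("input.txt")  // placeholder path
      .map(line => (line.length, 1))             // replaces the Mapper
      .reduceByKey(_ + _)                        // replaces the Reducer
    lengthCounts.saveAsTextFile("line-lengths")  // placeholder output dir
    sc.stop()
  }
}
```

The shuffle that reduceByKey performs plays the role of MapReduce's sort-and-shuffle phase, with no Writable boilerplate.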
A Java Runtime Environment (JRE) or Java Development Kit (JDK) must be available in order to run Eclipse. No Java virtual machine was found after searching the following locations:
Hope you are doing well. Please kindly follow the steps. Make sure that the JDK is installed on your system. Now click on advance set...
Tue, 29 Mar, 2016 at 1:35 PM
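The fix that reply starts to walk through amounts to pointing Eclipse at the installed JDK, either via the system PATH/JAVA_HOME settings or with an explicit `-vm` entry in eclipse.ini. A sketch of the eclipse.ini approach; the JDK path below is an example and must match your actual install:

```
-vm
C:\Program Files\Java\jdk1.8.0_77\bin\javaw.exe
-vmargs
```

The `-vm` flag and its path must be on two separate lines, and both must appear before `-vmargs`, otherwise Eclipse ignores them.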
Apache spark and scala reading material
Hello Karthik, Hope you are doing well. Please kindly check the attachment. You can also study from the Edureka blogs. ...
Sat, 2 Apr, 2016 at 11:20 AM
Hbase using Scala
Hello Kiran, Hope you are doing well. Please kindly check the attachment for the code snippet.
Mon, 18 Apr, 2016 at 5:02 PM
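The attached snippet itself is not in the preview, but basic HBase access from Scala goes through HBase's Java client API. A hedged sketch using the HBase 1.x client (table, column family, and qualifier names are examples only):

```scala
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Get, Put}
import org.apache.hadoop.hbase.util.Bytes

// Sketch: write and read one cell from HBase using the 1.x client API.
// "test_table", "cf", and "col" are placeholder names.
object HBaseDemo {
  def main(args: Array[String]): Unit = {
    val conf = HBaseConfiguration.create()  // picks up hbase-site.xml from the classpath
    val conn = ConnectionFactory.createConnection(conf)
    val table = conn.getTable(TableName.valueOf("test_table"))
    try {
      val put = new Put(Bytes.toBytes("row1"))
      put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes("value"))
      table.put(put)

      val result = table.get(new Get(Bytes.toBytes("row1")))
      val cell = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col"))
      println(Bytes.toString(cell))
    } finally {
      table.close()
      conn.close()
    }
  }
}
```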
Spark Hive ERROR XSDB6: Another instance of Derby may have already booted the database /home/edureka/metastore_db
It seems that you have started Hive in one terminal, and in another you are trying Spark + Hive. This is the reason you are facing this issue. ...
Wed, 30 Mar, 2016 at 6:41 AM
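The XSDB6 error arises because the default embedded Derby metastore allows only one process at a time: whichever of Hive or Spark opens /home/edureka/metastore_db first locks the other out. Closing the other session resolves it; a more durable fix is pointing both at a shared metastore database. A hedged hive-site.xml sketch using MySQL, where host, database name, and credentials are all placeholders:

```xml
<!-- Sketch: hive-site.xml pointing Hive (and Spark SQL) at a shared MySQL
     metastore instead of the single-process embedded Derby one.
     The URL, driver, user, and password below are example values. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
</configuration>
```

With this in place, both Hive and Spark can open the metastore concurrently, since the locking moves from Derby's on-disk lock to the database server.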