
Spark Developer Needed

₹600-1500 INR

Closed
Posted over 6 years ago

Paid on delivery
Hi, I want a Spark developer who will write small programs to retrieve results from a dataset.
Project ID: 15905088
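For context, the deliverable described above is typically a short Spark job that loads a dataset and runs a query over it. The sketch below is purely illustrative: the posting gives no details about the data, so the file path data/records.csv, the columns category and amount, and the aggregation itself are all assumptions, not requirements from the client.

// Minimal Spark/Scala sketch of a "small program to retrieve results from a dataset".
// Assumes a CSV at data/records.csv with columns "category" and "amount" (hypothetical).
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object RetrieveResults {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("RetrieveResults")
      .master("local[*]")          // local mode for a quick demo run
      .getOrCreate()

    // Load the dataset with a header row and inferred column types.
    val df = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("data/records.csv")

    // "Retrieve the results": a simple per-category aggregation, largest totals first.
    val results = df.groupBy("category")
      .agg(count("*").as("rows"), sum("amount").as("total_amount"))
      .orderBy(desc("total_amount"))

    results.show(20, truncate = false)
    spark.stop()
  }
}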

About the project

11 proposals
Remote project
Active 6 yrs ago

11 freelancers are bidding on average ₹1,568 INR for this job
Java/J2EE/Struts/Spring/JPA; Scala/Python (pyspark, pykafka)/Shell; Hadoop/Spark/Kafka/Cassandra/Hive/HBase; ELK
₹1,750 INR in 1 day
5.0 (44 reviews)
5.4
I am a data scientist and have experience with Big Data technologies like Spark and Hadoop. I also have experience with machine learning and statistical analysis of data using R and Python. Please provide more details about your project. I would like to do it.
₹4,500 INR in 1 day
5.0 (13 reviews)
4.9
I can help you with your Spark work. Relevant Skills and Experience: Spark, Scala
₹750 INR in 1 day
0.0 (0 reviews)
0.0
• Having 5 years of total experience in software design and development.
• Having working experience in the Hadoop ecosystem, including Apache Storm, Flume, Kafka, OrientDB, Elasticsearch, HDFS, MapReduce, HBase, Hive, Sqoop, etc.
• Having hands-on experience with Spark and machine learning (ML).
• Having working experience in Core Java, J2EE, JSP, Servlet, Struts 1.x, Spring MVC, the Hibernate Framework and JDBC.
• Experience in the education, e-commerce and banking domains.
• Hands-on experience in developing end-to-end software products/web applications, from requirement analysis and system study through design, coding, unit testing, debugging, documentation and implementation.
₹1,300 INR in 1 day
0.0 (0 reviews)
0.0
A proposal has not yet been provided
₹1,250 INR in 3 days
0.0 (0 reviews)
0.0
• Having 3+ years of Hadoop development: managing and tuning Hadoop clusters and their ecosystem components (Hive, Pig, Spark, HDFS, Sqoop, Flume, ZooKeeper, Kafka, YARN and Oozie).
• Excellent knowledge of Hadoop ecosystem components such as HDFS, ResourceManager, NodeManager, NameNode, DataNode and the MapReduce programming paradigm.
• Expertise in Spark SQL and knowledge of the Spark architecture.
• Good working experience with Spark using Scala, comparing the performance of Spark with Hive and SQL.
• Extensive experience in setting up Hadoop clusters. Good working knowledge of Hive, Pig and MapReduce.
• Experience with Oozie and its workflow scheduler to manage Hadoop jobs as a Directed Acyclic Graph (DAG) of actions with control flows.
• Good understanding of NoSQL databases and hands-on experience writing applications on NoSQL databases like HBase and MongoDB.
• Strong SQL and database knowledge.
₹1,300 INR in 1 day
0.0 (0 reviews)
0.0

About the client

Pune, India
5.0
7
Member since Feb 5, 2015

Client Verification
