
POC - Hadoop, HDFS, Kafka or Spark

₹1500-12500 INR

Closed
Posted 5 months ago

Paid on delivery
I am looking for a freelancer to help me with a Proof of Concept (POC) project focusing on Hadoop.

Requirement: we drop a file into HDFS, it is pushed to Spark or Kafka, and the final output/results are written to a database. The objective is to show we can take millions of records as input and land them in the destination. The POC should be completed within 3-4 days and should have a simple level of complexity.

Skills and experience required:
- Strong knowledge of and experience with Hadoop
- Familiarity with HDFS and Kafka/Spark
- Ability to quickly understand and implement a simple POC project
- Good problem-solving skills and attention to detail
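The pipeline described in the brief (file lands in HDFS, is pushed through Kafka or Spark, and results end up in a database) would normally be built as a short PySpark Structured Streaming job with a Kafka source and a JDBC sink. As a minimal, runnable sketch of just the batching/ingest logic, the stand-alone Python below simulates the stages with standard-library pieces: `sqlite3` stands in for the destination database, and the file name, batch size, and two-column schema are illustrative assumptions, not part of the brief.

```python
# Minimal local sketch of the POC's ingest stage: a "dropped file" of CSV
# records is read in batches (standing in for the HDFS -> Kafka/Spark hop)
# and bulk-inserted into a database (sqlite3 here as a stand-in for the
# real destination). Schema and batch size are illustrative assumptions.
import csv
import io
import sqlite3


def ingest(lines, conn, batch_size=10_000):
    """Parse CSV lines of (id, value) and bulk-insert them in batches.

    Returns the number of rows inserted.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS results (id INTEGER, value TEXT)")
    batch, total = [], 0
    for row in csv.reader(lines):
        batch.append((int(row[0]), row[1]))
        if len(batch) >= batch_size:
            conn.executemany("INSERT INTO results VALUES (?, ?)", batch)
            total += len(batch)
            batch.clear()
    if batch:  # flush the final partial batch
        conn.executemany("INSERT INTO results VALUES (?, ?)", batch)
        total += len(batch)
    conn.commit()
    return total


if __name__ == "__main__":
    # Simulate a file with one million records "dropped" into the landing zone.
    data = io.StringIO("".join(f"{i},record-{i}\n" for i in range(1_000_000)))
    conn = sqlite3.connect(":memory:")
    inserted = ingest(data, conn, batch_size=50_000)
    stored = conn.execute("SELECT COUNT(*) FROM results").fetchone()[0]
    print(inserted, stored)  # 1000000 1000000
```

The batching (`executemany` per chunk, one commit at the end) is what makes million-row inputs tractable; the real POC would get the same effect from Spark's partitioned writes against the target database.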
Project ID: 37543322

About the project

9 proposals
Remote project
Active 4 mos ago

9 freelancers are bidding on average ₹14,111 INR for this job
Experienced data engineer, certified in Spark and experienced with Kafka; I can work as per your requirements.
₹10,000 INR in 4 days
0.0 (0 reviews)
I have real-time experience with Spark and Kafka. If you are seeking a dedicated and experienced Big Data and Spark freelancer who can bring a wealth of knowledge to your projects, I am eager to discuss how I can contribute to your success. Let's connect and explore the possibilities of working together.
₹10,000 INR in 4 days
0.0 (0 reviews)
I am a Solutions Architect specialized in Data Analytics and Big Data, with a strong background in the design and development of batch, streaming, and event-driven applications. My specialties include:
- Kafka
- Spark
- Flink
- Hadoop
- Data streaming
- Designing and prototyping data analytics and dashboard apps

I can assist you with the architecture and POC of big data systems on multiple platforms such as Hadoop, Databricks, AWS, and Azure. Feel free to DM me if you would like to discuss the project details.
₹12,000 INR in 7 days
0.0 (0 reviews)
Hi, I have reviewed the project details and understood your requirement for a Data Engineer who can assist you with Spark data pipelines. I am capable of completing this task in a timely and cost-effective manner. Let's connect and discuss it further.

Short intro about me: I currently work at one of the Big 4 companies, solving Big Data business problems using the AWS Cloud and the PySpark engine. I have expertise in the following tools/technologies:
- Apache Spark (PySpark), Hadoop, HDFS
- Python 3, unit test scripts, NumPy, Pandas
- Apache Kafka, Spark Streaming, Glue Streaming
- AWS Cloud services (Lambda, EMR, DynamoDB, S3, Glue, EBS, Athena, IAM, Step Functions, EC2, ECS, Secrets Manager (KMS), SNS, SES, SQS, CloudWatch, etc.)
- Creation of CloudFormation templates for various AWS services
- CI/CD pipelines
- SQL/MySQL
- Git, GitHub, Bitbucket
- Big data pipelines
- IBM DB2
- MS Excel, MS Office
- PyCharm, VS Code, Databricks, Jupyter Notebook, PuTTY

Please provide more details if there is more to the project than what you have given in the description. Thank you!

Thanks & Regards,
Prakash
₹10,000 INR in 1 day
0.0 (0 reviews)
I have 5 years of real-time experience related to Big Data, and have worked on similar requirements before.
₹9,000 INR in 5 days
0.0 (0 reviews)
As a seasoned Hadoop administrator, I bring a wealth of expertise in managing and optimizing large-scale data infrastructure. With a proven track record of successfully overseeing Hadoop clusters, I excel in implementing best practices to ensure seamless data processing and storage. My in-depth knowledge of Hadoop ecosystem components such as HDFS, MapReduce, and YARN empowers me to fine-tune performance and troubleshoot issues efficiently.

I am adept at crafting robust security measures to safeguard sensitive data, implementing backup and recovery strategies, and staying abreast of the latest advancements in big data technologies. My hands-on experience with Hadoop distributions like Cloudera and Hortonworks positions me as a reliable professional capable of adapting to diverse project requirements. I am committed to delivering high-quality solutions, streamlining workflows, and minimizing downtime through proactive system monitoring.

With excellent communication skills, I collaborate seamlessly with cross-functional teams, translating technical details into accessible insights for stakeholders. My dedication to staying ahead of industry trends and commitment to continuous learning make me a valuable asset for any project seeking a proficient Hadoop administrator. Choose me, and you'll benefit from my passion for excellence and a results-driven approach to Hadoop administration.
₹50,000 INR in 7 days
0.0 (0 reviews)
I have more than 10 years of experience in big data. I can help you process large amounts of data, whether deployed directly within a Kubernetes or Docker Swarm cluster or on services like GCP Dataproc, AWS EMR, AWS Glue, or Azure Synapse.
₹7,000 INR in 7 days
0.0 (0 reviews)

About the client

New Delhi, India
5.0
25
Payment method verified
Member since Dec 18, 2014

Client Verification

Copyright © 2024 Freelancer Technology Pty Limited (ACN 142 189 759)