Cloudera resource allocation optimization for HDFS/Spark environment
$30-250 USD
Closed
Posted almost 7 years ago
Paid on delivery
Simple & quick job for an expert: we need to optimize performance by configuring server resource utilization (memory and CPU) on the nodes of a recently installed environment running Cloudera CDH 5.6 and 5.10.
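For context on what this kind of tuning usually involves: on CDH, Spark's resource settings have to fit inside what YARN exposes per node (container memory and vcores), with headroom left for the OS and Hadoop daemons. Below is a minimal Scala sketch of the Spark-side knobs in play; the node size (64 GB / 16 cores) and every value are illustrative assumptions, not recommendations.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical starting point for a worker node with 64 GB RAM and 16 cores,
// leaving headroom for the OS and the HDFS DataNode / YARN NodeManager daemons.
val conf = new SparkConf()
  .setAppName("resource-tuning-baseline")
  .set("spark.executor.memory", "12g")                 // per-executor heap
  .set("spark.executor.cores", "4")                    // cores per executor
  .set("spark.executor.instances", "4")                // executors (static allocation)
  .set("spark.yarn.executor.memoryOverhead", "2048")   // off-heap overhead in MB (CDH 5.x-era name)

val sc = new SparkContext(conf)
```

The right split between executor count, cores, and heap depends on the workload, which is why the bids below stress measuring against a defined scenario rather than applying fixed numbers.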
Hi,
I have experience with Spark development and cluster configuration optimization. The right configuration usually depends on the type of workload (file sizes, processing type, etc.).
So, for a successful optimization it is best to define a test scenario and target performance metrics up front; a rough sketch of that idea follows this message.
Please let me know if you would like to discuss the details.
Regards,
Mykhail
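To make the test-scenario point above concrete: the idea is a repeatable, timed run of a representative job, so each configuration change can be compared against a fixed target. A rough Scala sketch follows; the HDFS path, the word-count workload, and the 120 s target are illustrative placeholders, not from the posting.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object TuningBenchmark {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("tuning-benchmark"))
    val start = System.nanoTime()

    // Representative workload: word count over a sample of production data.
    val distinctWords = sc.textFile("hdfs:///data/sample/*")   // placeholder path
      .flatMap(_.split("\\s+"))
      .map(w => (w, 1L))
      .reduceByKey(_ + _)
      .count()

    val elapsedSec = (System.nanoTime() - start) / 1e9
    // Compare elapsed time against the agreed target metric, e.g. < 120 s.
    println(s"distinctWords=$distinctWords elapsed=${elapsedSec}s")
    sc.stop()
  }
}
```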
I am a big data and machine learning developer at a private firm, with good knowledge of big data tools and technologies including Cloudera and Amazon EC2. Send me a message for further discussion.
I am a certified Hadoop Administrator with around 6.5 years of experience in the design, development, testing, and support of open source technologies such as Hadoop, HDFS, Pig, Hive, NiFi, MongoDB, Sqoop, and Core Java. I can help you with this.
My name is Narsireddy. I am a big data expert with more than 10 years of IT experience in Hadoop, Spark, Scala, MongoDB, Java, and Kafka.
I have architecture experience across multiple big data projects, including data lake and ETL implementations.
- My last 4 projects have been on Hadoop and Spark.
- I am currently working with Kafka and Spark Streaming to process Fitbit data.
- My next assignment will be processing EMR data using Spark.
- I have strong hands-on experience with Spark, Scala, and Java.