
System Admin required for Ubuntu / AWS CLI data offloading (backup) solution

$250-750 AUD

Closed
Posted about 4 years ago


Paid on delivery
I need a GUN Ubuntu system admin with AWESOME AWS expertise. You are probably going to be a CentOS freak as well, and I'll appreciate that too (for potential ongoing work). If you don't know how to write insane scripts and you are not all over the AWS CLI, then DON'T. BOTHER. PITCHING. If you don't know what IAM users are and how to craft clean policies, then log off Freelancer now and consider your life choices. If you can collaborate with me between 11am and 6pm AEDT/AEST, please make that clear in your pitch. (I am based in Melbourne and prefer Slack for communicating and collaborating in real time. Does anyone email anymore?)

I run a managed services agency, and I want to update the way we perform disaster-recovery offsite data storage. This will be developed on one production server, and we will roll it out to the remaining servers.

I need a script, or set of scripts, that iterates through a configured set of directories, for example:

target_directory=/home/user1/:/home/user2
backup_staging=/backups
s3_bucket=s3://bucket
server_ref=prod01
notification_email=myticketingsystem@mydomain.[login to view URL]

The script should:

- Iterate through all subdirectories in target_directory and create a corresponding [login to view URL] archive, with the date/time in the filename, in backup_staging. For example, /users/fred/www.fred.com would be archived to /backups/fred/www.fred.com/www.fred.[login to view URL]
- During the archiving process, the .git directory in the subdirectory root MUST NOT be included.
- Once the archive is created, use the AWS CLI to copy it to the bucket, creating a directory structure if required that mirrors the one inside the backups directory but with server_ref first, i.e. s3://bucket/server_ref/[login to view URL][archive]
- Once the archive is sent to S3, remove any archives older than 45 days from S3 (this can be done via policy if possible/preferred).
- Once the archive is CONFIRMED to have been copied to S3, remove the local archive, i.e. /backups/fred/www.fred.com/www.fred.[login to view URL] Then process the NEXT subdirectory in the target directory. This means one archive is created, offloaded and removed from the staging area before the next one is processed. Once the first target directory is processed, move on to the next one.
- If an error occurs at any time, send an email. Once offloading is complete for all sites, send a completion email, and report in it how long the run took.
- Run at 2am each morning.

You should tell me what is good/bad/indifferent about my approach (bearing in mind this will be running on 16 servers) and help me craft something even better (if that's at all possible).

Stage 2 of this will be to configure a VM (somewhere) with x TB of storage that uses AWS CLI sync to copy down ALL server backups from the S3 storage to a local storage device. Tell me in your pitch whether "aws sync" deletes files in the destination location or reflects changes in the destination back on the source.

You will need to sign an NDA before starting on this. Also, let me know your favourite brand of coffee and what you thought of The Expanse.
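A minimal bash sketch of the archive-and-offload step described above. The function names, the archive naming scheme, and the use of tar's --exclude are my assumptions, not part of the brief; the config values are the placeholders from the post:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the per-site backup step from the brief.
set -euo pipefail

# Placeholder config from the job post; override via the environment.
backup_staging="${backup_staging:-/backups}"
s3_bucket="${s3_bucket:-s3://bucket}"
server_ref="${server_ref:-prod01}"

# Create a timestamped tar.gz of one site directory in the staging area,
# excluding the .git directory at its root, and print the archive path.
archive_site() {
  local site="$1" name stamp dest
  name=$(basename "$site")
  stamp=$(date +%Y%m%d-%H%M%S)
  dest="$backup_staging/$name"
  mkdir -p "$dest"
  local archive="$dest/$name-$stamp.tar.gz"
  tar --exclude='./.git' -czf "$archive" -C "$site" .
  printf '%s\n' "$archive"
}

# Copy one archive to S3 under server_ref, mirroring the staging layout,
# and remove the local copy only if the upload succeeded. On failure the
# function returns nonzero so the caller can send the error email.
offload_archive() {
  local archive="$1" name
  name=$(basename "$(dirname "$archive")")
  aws s3 cp "$archive" "$s3_bucket/$server_ref/$name/$(basename "$archive")" \
    && rm -f "$archive"
}
```

Wired into a loop over the subdirectories and scheduled with a cron entry such as `0 2 * * * /usr/local/bin/offload.sh`, this gives the one-archive-at-a-time behaviour the brief asks for; the error and completion emails are left out of the sketch.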
Project ID: 24130607

About the project

10 proposals
Remote project
Active 4 yrs ago

10 freelancers are bidding on average $494 AUD for this job
Hello, I have read your project requirements and understood them completely. We handle all types of server-related issues and have an expert team for this; we will be able to do the work, but we need full details about your project. We are a leading IT company in Indore, India, with extensive experience in app development and design, backend development, cloud services, and enterprise solutions. We are a team of certified Amazon solution architects and part of the Amazon APN partner network. We provide all kinds of solutions related to network and system administration: Linux, Unix, Apache, servers, Lambda, EC2, OpenSSL, and any customised needs. Please check our recent work: Need AWS Setup for an ASP.NET Application AWS | EC2 | RDS | BEANSTALK | CLOUD FORMATION https://www.freelancer.com/projects/Amazon-Web-Services/Lambda-setup-push-cloudwatch-logs/ https://www.freelancer.com/projects/Cloud-Computing/Setup-New-Azure-Cloud/ https://www.freelancer.com/projects/Amazon-Web-Services/Push-AWS-RDS-psql-bucket/ https://www.freelancer.in/projects/Amazon-Web-Services/Need-AWS-Setup-for-ASP/ https://www.freelancer.com/projects/Linux/FIX-OPENVPN-IPTABLES-CENTOS/ https://www.freelancer.com/projects/nodejs/code-deployment-heroku-AWS/ I am sure the examples above will show you the quality of our work. Thanks and regards, Deepak
$500 AUD in 7 days
4.9 (69 reviews)
6.9
Hi, I understand your requirement and can fulfil it: I am a DevOps engineer with over six years of experience in the field. Services offered: DevOps, SysOps, AWS/GCP cloud and Linux server administration, Apache/Nginx web server automation, Git-Jenkins integration, MySQL, MongoDB, MariaDB, Ansible/AWX scripting, virtualisation, Docker, Kubernetes, VPN setup, OpenVPN, SIP, Asterisk, PSTN, VoIP. Please initiate a chat session to discuss the project further.
$500 AUD in 2 days
4.8 (51 reviews)
6.4
***AWS EXPERT*** Hi, hope you are doing great! I have extensive work experience in server administration and project management. AWS services: EC2, S3, RDS, CloudFront and many more. I provide all kinds of solutions related to network and system administration: Linux, Unix, Apache, servers, Lambda, EC2, OpenSSL. I am grateful for your time and consideration, and I look forward to speaking with you further about this position. I am willing to work 40 hours per week on your project if you hire me. Warm regards, Ranu
$500 AUD in 7 days
4.8 (36 reviews)
5.2
Hire me. I've written a bash script like this to back up files (to a remote FTP server rather than S3, but adapting it for S3 is no problem), so I can do exactly what you want. I'm a server and Linux system admin; AWS, Google Cloud and Microsoft Azure cloud expert; email server and DNS expert; web security analyst, website performance optimiser and web designer. "aws sync" doesn't delete files in the destination location; it reflects changes from the source to the destination. My favourite brand of coffee is Nescafé. Yes, I can collaborate with you between 11am and 6pm AEDT/AEST. And for a Linux guy, Ubuntu vs CentOS doesn't matter; I work on both.
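On the "aws sync" question, this bidder describes the default behaviour correctly: the sync is one-way and non-destructive unless told otherwise. A sketch of the Stage 2 pull-down, wrapped in a function (the bucket and mount point are placeholders, not values from the brief):

```shell
#!/usr/bin/env bash
# Stage 2 sketch: mirror all server backups from S3 down to a storage VM.
# Bucket name and destination path are hypothetical placeholders.
mirror_down() {
  # 'aws s3 sync' is strictly one-way: it copies new and changed objects
  # from the source to the destination and never modifies the source.
  # By default it deletes nothing in the destination; the --delete flag
  # removes destination files that no longer exist in the source, making
  # the destination an exact mirror.
  aws s3 sync "s3://bucket" "/mnt/backups" --delete
}
```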
$300 AUD in 3 days
4.5 (71 reviews)
5.1
Hi mate, I hope you are doing great after writing that project description. Best one I have read in ages, better than the ones I receive from "teams of consultants". I quickly reviewed my life choices and still think I can bid. Slack is fine; however, I will tend to keep the 11am to 6pm collaboration to a minimum, as that's my office time too (I'm based in Melbourne). I'll mostly work on the project after hours but can share an update in the morning window.

* Stage 1 can be handled in one script and wouldn't be that complex to maintain. However, do you prefer bash only? I was also thinking Ansible (portable, maintainable), but that means you'd need Ansible installed. Your choice.
* Reporting completion time or state by email can be done via SNS, but that needs supporting infrastructure deployed as CloudFormation for the S3, SNS and IAM roles/policies involved.
* Agreed that an S3 lifecycle policy is the best choice for expiring older archives.
* I can provide you with a script to set up cron; however, if a server is rebuilt you'll have to ensure it gets executed again. I can guide you on a User Data update.
* When we get to Stage 2 there are other options, e.g. S3 Replication. As far as sync is concerned, it does not modify the source bucket and does not delete anything in the destination bucket; it only updates modified files (by size or timestamp) in the destination, or creates new ones.
* I am currently enjoying a mocha with almond milk, and your favourite is on me upon project completion.

Regards, Salman
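The lifecycle-policy point this bidder agrees with can be made concrete. Assuming the 45-day expiry should apply to everything in the backup bucket (the rule ID and empty prefix are my choices), a configuration like this, applied once with `aws s3api put-bucket-lifecycle-configuration`, replaces any scripted deletion of old archives:

```json
{
  "Rules": [
    {
      "ID": "expire-backups-after-45-days",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 45 }
    }
  ]
}
```

Scoping the Prefix to a server_ref (e.g. "prod01/") would let different servers keep different retention windows.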
$500 AUD in 7 days
5.0 (13 reviews)
4.1
Nice to meet you. I am an Amazon cloud architect for a web infrastructure serving 90 million page impressions and 12 TB of Internet traffic per month. The AWS services I use are EC2, ELB, MySQL RDS, VPC, CloudFront, ElastiCache, CloudWatch, CloudFormation, OpsWorks, Elastic Beanstalk, CodeDeploy, S3, SES, SQS and SNS. I have 20 years of Linux sysadmin experience and currently use Apache, Nginx, Ldirectord, MySQL, Perl, PHP, Memcached, Sphinx, BIND, TYPO3, WordPress, Sendmail, Postfix, NFS, Samba, Snort, vsftpd, AIDE, Nagios, Cacti, Puppet and a bunch of other traditional Linux software. I am good at Amazon Web Services, bash scripting, Linux, system administration and Ubuntu. If you're looking for a developer who is truly an expert, driven by passion, not afraid to take on a challenge, and will be there with you every step of the way, then look no further: I'm your guy.
$419 AUD in 5 days
4.9 (7 reviews)
3.9
I guess I'm sort of breaking your rules about not bidding, but here goes:

GNU/Linux admin - check
AWS - check
I prefer Debian over CentOS :/
I'm good enough with the AWS CLI, but I need to look at the docs sometimes
11am to 6pm AEDT/AEST works for me

Your approach: it's simple, which is a good thing, but it also likely wastes disk space and CPU resources. An incremental/differential system could be more efficient, but it depends on the data being backed up. We can come up with the ideal plan when we discuss.

NDA: I'm fine with that.

Starbucks latte, and I haven't gotten around to watching The Expanse yet. I would love to help you with this project. Please message me so we can discuss.
$500 AUD in 7 days
5.0 (1 review)
2.0
Hello. This is Viky Bansod. I work in DevOps and cloud operations, with over six years of total experience across Ubuntu 14.04/16.04, Red Hat and Amazon Linux, and I hold the AWS Solutions Architect Associate certification. I can provide the script for the backup and for pushing it to S3, and can also set up the data lifecycle for moving backups from the local server to S3 and deleting them after 45 days. For further discussion, please message me. About coffee: it's the CCD brand (Café Coffee Day). And if you're talking about The Expanse series on Amazon, I haven't watched it yet :) Thanks, Viky Bansod
$666 AUD in 7 days
0.0 (0 reviews)
0.0
Hello, I am a certified AWS engineer (at all three exam levels) with many years of experience working on AWS and scripting in bash, Python, boto3 and more. I have worked with many clients via Slack and look forward to working with you as well. If you are interested, kindly PM me to discuss. Many thanks, Harsha Rathnayaka
$555 AUD in 4 days
0.0 (0 reviews)
0.0

About the client

Frankston, Australia
5.0
36
Payment method verified
Member since Mar 7, 2013


Freelancer ® is a registered Trademark of Freelancer Technology Pty Limited (ACN 142 189 759)
Copyright © 2024 Freelancer Technology Pty Limited (ACN 142 189 759)