Fixing MEGA-to-Google Drive Python code to download via Google Colab

₹600-1500 INR

Closed
Posted about 1 year ago

Paid on delivery
Hi, I have been using the following Colab notebook to transfer files from MEGA to Google Drive. I can download up to 5 GB using the free download quota, but since today I can't log in to my MEGA account to use its bandwidth; it is showing an error. If someone can fix it, that would be great. Here is the Colab notebook: [login to view URL]. Thanks
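For context, the transfer such a notebook performs can be sketched roughly as follows. This is a minimal illustration assuming the mega.py package and a Colab runtime; EMAIL, PASSWORD, and MEGA_URL are placeholders rather than details from the actual notebook, which is only visible after login.

```python
# Minimal sketch of the MEGA -> Google Drive flow (assumed, since the actual
# notebook is behind a login). Requires the mega.py package:
#   !pip install mega.py
# EMAIL, PASSWORD, and MEGA_URL below are placeholders, not real values.

from google.colab import drive
from mega import Mega

# Mount Google Drive so the download lands in persistent storage.
drive.mount('/content/drive')

mega = Mega()

# Logging in with an account uses that account's transfer quota; an anonymous
# session (mega.login() with no arguments) is limited to the free quota the
# poster describes (~5 GB).
m = mega.login('EMAIL', 'PASSWORD')

# Download the shared file straight into the mounted Drive folder.
m.download_url('MEGA_URL', dest_path='/content/drive/MyDrive/')
```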
Project ID: 36183915

About the project

4 proposals
Remote project
Active 1 yr ago

4 freelancers are bidding an average of ₹970 INR for this job
I can help you solve the error. Greetings, I have gone through your project description and find myself a good fit for this job. I have been working as a Python developer for the last 2 years. My expertise includes:
1. Web scraping/web automation - Selenium, Scrapy, Requests, BeautifulSoup
2. AI and ML
3. Web design
4. WordPress
5. Data science
6. C/C++
7. SQL
I will be available 24/7 to assist you during the project, so let's discuss it further over chat. Yours faithfully, Jaibhan Singh Gaur
₹1,000 INR in 2 days
5.0 (64 reviews)
5.1
Hello, I am a professional Python developer. My main specializations are automation, web scrapers, and bot development. I have already developed over 200 scrapers, from the simplest (for example, a competitor's price collector) to complex parsers (with authorization, captcha bypassing, rotating IPs, and more) that can collect millions of products from Amazon. I have built web scrapers for:
- Amazon
- Instagram
- Facebook
- Google
- Twitter
- LinkedIn
- Pinterest
- Walmart
- And many others
For scraping I use:
- Python
- Requests
- BeautifulSoup
- Selenium
- Scrapy
- PyAutoGUI
- Undetected Chromedriver
- Rotating IPs
I can bypass:
- Cloudflare
- IP blocking
- Captchas
- Required authorization
- Other limitations
Django / PostgreSQL: for big scraping projects I usually use Django with PostgreSQL. This lets us store information in a database for further processing and use. I also set up an administration area that allows us to check the data and configure the scrapers. If you need a professional solution in this area, I am ready to cooperate, and I am ready to make a sample script before we start. Regards, Oleg
₹600 INR in 7 days
4.8 (6 reviews)
3.6
Hi, we are a team of developers and artists with over 6 years of experience in the industry. We have already completed multiple similar projects and are certain we can finish this one in no time. Please leave a message for more info and samples. Thank you. - FusionFacet Team
₹1,000 INR in 1 day
0.0 (0 reviews)
0.0
I have a range of skills that allow me to solve complex problems in the field of software development. Specifically, I have expertise in PHP, JavaScript, and Python, which provide a strong foundation in web development and scripting. In addition, my knowledge of computer architecture enables me to understand the underlying hardware and optimize software performance accordingly. I believe that my diverse skill set allows me to troubleshoot technical issues effectively and develop innovative solutions that integrate multiple technologies.
₹1,280 INR in 2 days
0.0 (0 reviews)
0.0

About the client

Kochi, India
5.0
9
Payment method verified
Member since Jan 4, 2023
