Data ingestion jobs

    448 data ingestion jobs found, pricing in SGD

    I'm seeking a proficient developer to build a comprehensive demo or starter code integrating Snowflake and Kafka using AWS cloud. The ideal candidate should have a deep understanding of concepts such as: - Data ingestion - Data transformation - Data storage and retrieval The code should ensure: - Optimal performance - Scalability - Security The code can be written in Python or Java. In-depth knowledge of these languages, as well as experience with Snowflake and Kafka integration, is a necessity for a successful bid on this project.
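    A minimal sketch of one piece such a demo might contain — the micro-batching step between consuming from Kafka and writing to Snowflake. Everything here is a hypothetical stand-in: the sink callable would wrap a Snowflake insert in a real build, the consumer loop would wrap kafka-python's `KafkaConsumer`, and the field names are invented.

```python
# Hypothetical sketch, not the client's design: the consumer and Snowflake
# writer are stand-ins; field names ("id", "value") are invented.
import json

class MicroBatcher:
    """Accumulate parsed messages and flush them to a sink in fixed-size batches."""
    def __init__(self, sink, batch_size=100):
        self.sink = sink              # callable receiving a list of rows
        self.batch_size = batch_size
        self.buffer = []

    def ingest(self, raw_message: bytes):
        # Transform: parse JSON and keep only the fields the target table needs.
        record = json.loads(raw_message)
        self.buffer.append({"id": record["id"], "value": record["value"]})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink(list(self.buffer))   # e.g. a batched INSERT into Snowflake
            self.buffer.clear()

batches = []
batcher = MicroBatcher(sink=batches.append, batch_size=2)
for i in range(5):
    batcher.ingest(json.dumps({"id": i, "value": i * 10, "noise": "x"}).encode())
batcher.flush()                            # drain the remainder
print([len(b) for b in batches])           # → [2, 2, 1]
```

    Batching like this is one common way to trade a little latency for much higher insert throughput; the right batch size would depend on the target warehouse.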

    $18 (Avg Bid)
    4 bids

    Need assistance with the development of a well-structured, efficient data pipeline. This system is expected to integrate a range of data sources, including APIs, DB views and SFTP, and prepare them for backend ingestion in CSV format. Key Responsibilities: - Source data from APIs, DB views, and SFTP - Transform ingested data into CSV format Ideal Skills and Experience: - Expertise in creating and managing data pipelines - Proficiency in various data integration methods, specifically APIs, DB views, and SFTP - Excellent knowledge of data transformation and loading processes Please note: While optional transformations like data filtering, aggregation, and mapping are not explicitly required, the ideal candidate should be able ...
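    The transform-to-CSV step described above can be sketched in a few lines. This is illustrative only: the three source fetchers (API, DB view, SFTP) are faked with in-memory rows, and the field names ("id", "amount") are assumptions, not the client's schema.

```python
# Illustrative sketch: normalise rows from heterogeneous sources into one
# CSV payload for backend ingestion. Fetchers are faked; fields are assumed.
import csv
import io

def to_csv(records, fields):
    """Render a list of dicts as CSV text, dropping fields outside `fields`."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

api_rows  = [{"id": 1, "amount": 9.5, "debug": "drop-me"}]   # from an API
db_rows   = [{"id": 2, "amount": 12.0}]                      # from a DB view
sftp_rows = [{"id": 3, "amount": 7.25}]                      # from SFTP

payload = to_csv(api_rows + db_rows + sftp_rows, fields=["id", "amount"])
print(payload.splitlines())   # → ['id,amount', '1,9.5', '2,12.0', '3,7.25']
```

    `extrasaction="ignore"` silently drops source fields that are not in the target schema, which keeps the CSV stable even when upstream payloads change.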

    $19 / hr (Avg Bid)
    8 bids

    As the professional handling this project, you'll engage with big data exceeding 10GB. Proficiency in Python, Java, and PySpark is vital for success, as we demand expertise in: - Data ingestion and extraction: The role involves managing complex datasets and running ETL operations. - Data transformation and cleaning: You'll also need to audit the data for quality and cleanse it for accuracy, ensuring integrity throughout the system. - Handling streaming pipelines and Delta Live Tables: Mastery of these could be game-changing in our pipelines, facilitating the real-time analysis of data.

    $26 / hr (Avg Bid)
    35 bids

    ...protocol of a list of sites to augment the LLM. Key Criteria for Success: Robust technical execution that meets core requirements Modular, extensible architecture for flexibility over time Quality, well-tested code delivered on time Clear, prompt communication throughout project Key Components: Text/file ingestion and vectorization Semantic similarity search in Milvus/SPTAG Local post-hoc model explanations Production-ready Flask/FastAPI UI Documentation and usage guides Document Ingestion (spaCy, Beautiful Soup, PyMuPDF): Scrape/ingest docs, extract text, handle formats Vectorization (Sentence/Doc2Vec): Convert text to distributed representations Vector Database (Milvus, SPTAG): Store/index vectors at scale for retrieval Semantic Search (FAISS): Leverage ANN for related...

    $1473 (Avg Bid)
    61 bids

    ...Conduct a comprehensive assessment of Luxetrip's business requirements, target market, and competitive landscape to inform the platform architecture and roadmap 2. Define the key components, modules, and interfaces of Luxetrip's AI-powered travel platform, including recommendation engines, pricing algorithms, and personalization frameworks 3. Design and develop the platform's data infrastructure, including data ingestion, storage, processing, and analytics capabilities 4. Architect and implement the platform's microservices-based backend, ensuring scalability, reliability, and security 5. Collaborate with the front-end development team to define and implement the platform's user experience and interface, optimizing for conversion and eng...

    $2175 (Avg Bid)
    NDA
    97 bids

    I am looking for a skilled data engineer to assist with a project encompassing data ingestion and storage, transformation and processing, as well as ensuring adherence to data governance and compliance standards. Key Responsibilities: - Implementing real-time data streaming and batch data processing for the ingestion and storage of data. - Leveraging cloud-based data storage solutions, particularly within the Azure framework, to ensure scalability and reliability. - Maintaining data governance and data protection. Desired Outcomes: - Data transformation and processing should focus on efficient and effective data cleaning and normalization. - Further, data aggregation and summarization should help in derivi...

    $990 (Avg Bid)
    7 bids

    I am in need of a data engineer with strong design architecture skills to create a Databricks UI/UX prototype for the primary purpose of processing flat and text files. Steps - create a generic pipeline that can process data files of multiple formats and frequencies, up to 1 TB/day. Create the design in the Databricks UI. There should be multiple approaches/architectures to design the requirements. This project specifically targets data engineers, requiring the design to enhance how they interact with data and improve efficiency in their daily tasks. Your role will encompass: - Designing an interface that simplifies the data ingestion process - Crafting intuitive tools for effective data transformation - Integrating seamless data integr...

    $194 (Avg Bid)
    12 bids

    ...stakeholders to gather and understand visualization and analytics requirements for the platform. • Design a comprehensive cloud-native architecture for the visualization and analytics platform, encompassing data ingestion, processing, storage, visualization, and analytics components. • Leverage your expertise to recommend open-source technologies, off-the-shelf software tools, or a combination of both to implement the chosen architecture. • Translate the overall architecture into a specific AWS cloud implementation plan. o Select appropriate AWS services for each component of the architecture (e.g., data storage, compute, analytics services, visualization tools). • Consider cost optimization strategies for the AWS cloud implementation • Iden...

    $476 (Avg Bid)
    23 bids

    I'm searching for a PySpark expert who can provide assistance on optimizing and debugging current PySpark scripts. I am specifically focused on PySpark, so expertise in this area is crucial for the successful completion of this project. Key Responsibilities: - Optimizing PySpark scripts to improve efficiency - Debugging current PySpark scripts to resolve existing issues Ideal Candidate: - Proficient with PySpark - Experience in big data management, data ingestion, processing, analysis, visualization, and reporting - Strong problem-solving skills to identify and resolve issues effectively - Knowledgeable in perfo...

    $138 (Avg Bid)
    65 bids

    ...updated by reading data from a TXT file. The process must be automated via a script, filling in blank cells distributed across all the sheets in the document. Key requirements include: 1. The ability to read and accurately interpret all fields from the TXT file. 2. Updated data should be posted to all sheets within the Excel document, in the required cells. 3. In the case of data discrepancies within the TXT file, enable an automated resolution by defaulting to a specific value. Leveraging your expertise in data extraction, manipulation, and Excel automation will be instrumental for this project. Ideally, you are proficient in scripts suitable for handling such tasks, such as Python or VBA, contingent on your recommendation. Familiarity ...
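    Requirement 3 — defaulting on discrepancies — can be sketched as below. This is a minimal illustration of the parsing rule only; actually writing the values into Excel cells (e.g. via openpyxl or VBA) is left out, and the pipe-delimited layout and field names are assumptions.

```python
# Minimal sketch of the discrepancy rule: any blank or missing field in a
# TXT line is replaced with a default value. Field layout is assumed.
DEFAULT = "N/A"
FIELDS = ["sheet", "cell", "value"]

def parse_line(line, default=DEFAULT):
    parts = [p.strip() for p in line.split("|")]
    parts += [""] * (len(FIELDS) - len(parts))   # pad short rows
    # Default any blank field instead of failing on the discrepancy.
    return {f: (v if v else default) for f, v in zip(FIELDS, parts)}

rows = [parse_line(l) for l in ["Sales|B2|100", "Costs|C3|", "Totals"]]
print(rows[1]["value"], rows[2]["cell"])   # → N/A N/A
```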

    $619 (Avg Bid)
    28 bids

    ...only a few hours!! I'm seeking a Python developer with experience in data analysis and visualization to accomplish a beginner-level project. The primary feature will be the utilization of business data for analysis and processing. Key attributes of this project include: - **Skillsets Required** - Python programming, specifically for data analysis. - Experience with libraries such as Pandas, NumPy, Matplotlib and Seaborn. - Comprehension of business and financial data. - **Project Outcomes** - Ingestion and processing of hotel data. - Generation of interactive visual graphs from processed data. Please note, the ultimate expectation is to deliver a Python solution that can analyze, process, and transform data i...

    $20 (Avg Bid)
    3 bids

    ...used cat litter, as cat feces may contain Toxoplasma gondii, a parasite that can cause toxoplasmosis. Pregnant or nursing women, as well as children and anyone with a weakened immune system, should avoid handling used cat litter. Always handle litter in a ventilated area to avoid inhalation. Keep contents and packaging out of reach of children and pets. Seek medical help in case of accidental ingestion. Ingredients: 75% Soy fiber, 22% Corn starch and 3% Guar gum About Enterprises Pty Ltd PO Box 328 Moorooka QLD 4105 MADE IN CHINA CONNECT WITH US (INSERT X, TIKTOK, INSTAGRAM, FACEBOOK, YOUTUBE ICONS) REORDER (INSERT QR CODE LINKING TO - USE A QR CODE THAT LINKS DIRECTLY TO THE PAGE AND IS NOT REDIRECTED BY A THIRD PARTY)

    $178 (Avg Bid)
    Guaranteed
    57 entries

    ...Technologies**: - Select appropriate technologies for building the website and managing the database. - Consider using a web framework like Django or Flask for the website, and a database management system like PostgreSQL or MongoDB for the database. 4. **Develop Data Ingestion Pipeline**: - Implement a data ingestion pipeline to continuously fetch new AI tools from various sources. - This pipeline could involve web scraping, integration with APIs, or manual data entry, depending on the availability of data sources. 5. **Implement Indexing and Categorization**: - Develop algorithms to automatically categorize new AI tools based on their attributes. - Use natural language processing (NLP) techniques to analyze tool descript...

    $2154 (Avg Bid)
    76 bids

    ML Engineer Responsibilities - Develop all data processing logic, combinations, and pathways for data preparation. - Implement ML models for clustering for outlier detection & removal. - Data ingestion pipeline requirements - Design & implementation of data science side of the entire Data Preprocessing Funnel Requirements - Proven experience as an ML Engineer or similar role. - Proficient in Python & related ML tools. - Familiar with ML algos. - Knowledge of statistical basis of ML models. - Ability to work independently and as part of a team. - Excellent communication and problem-solving skills. - Attention to detail and quality.
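    The outlier detection & removal step above can be sketched as follows. Note this is a deliberate simplification: a z-score rule stands in for a real clustering model (k-means or DBSCAN would be typical), so the shape of the preprocessing funnel is visible without scikit-learn; the threshold is an assumption.

```python
# Stand-in for the outlier-removal step: a z-score rule instead of a
# clustering model, so no scikit-learn dependency is needed here.
from statistics import mean, stdev

def remove_outliers(values, z_max=2.0):
    mu, sigma = mean(values), stdev(values)
    # Keep only points within z_max standard deviations of the mean.
    return [v for v in values if abs(v - mu) / sigma <= z_max]

data = [10, 11, 9, 10, 12, 11, 250]      # 250 is the obvious outlier
print(remove_outliers(data))             # → [10, 11, 9, 10, 12, 11]
```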

    $1845 (Avg Bid)
    35 bids

    I am looking to develop further my existing understanding of the Snowflake data warehousing platform. My primary area of interest is learning how to ingest data on this platform. I have an external Snowflake share with about 34 tables/views for this practice, which is refreshed daily. I wish to create a permanent database in my account from the external share. Unfortunately, if there are edits to the shared data, they overwrite the prior day's data, so I want to make sure my account has a history to catch changes. I wish to automate the daily ingestion of this shared data. Once done, we can move on to making a second basic reporting database in my account which Power BI and other tools can leverage. The tutorial will ideally include hands-on t...

    $27 / hr (Avg Bid)
    7 bids

    ...managing all enterprise documents efficiently. 2. Goals Automate document processing and categorization using AI. Enhance document search capabilities through Bing integration. Improve information accessibility and collaboration across the organization. Reduce administrative tasks associated with document management. Enhance document security and compliance. 3. Functional Requirements Document Ingestion: Support various document formats (e.g., PDF, Word, Excel, image). Automatic document import from various sources (e.g., email, scanner, file systems). Manual document upload with metadata tagging. Document Processing: Automatic text extraction using Optical Character Recognition (OCR). Content analysis and keyword identification using Natural Language Processing (NLP). Docume...

    $3246 (Avg Bid)
    71 bids

    I am looking for an expert in Grafana Stack and AWS CloudWatch, proficient in IoT-Core, Lambda, Kinesis, IoT SiteWise, and Amazon OpenSearch Service. The objective is dual: Data Ingestion: - Efficient and reliable ingestion of sensor data through Grafana or AWS IoT-Core and Kinesis. Data Visualisation: - Use Grafana or AWS IoT SiteWise to analyze and visualize sensor data. - Generation of insight-driven reports using Amazon OpenSearch Service would also be expected. Ideal skills: - Proficiency in Grafana or AWS IoT suite - Experience with data ingestion and visualization tools - Capabilities in Sensor data processing and analytics.

    $23 / hr (Avg Bid)
    23 bids

    ...you examples of the data. Example sources are Twitter, Slack, and Quickbooks data from an API pull. All APIs are already set up and pulling the JSON files. You only need to set up the Pinecone database. Should Have: -Thorough knowledge of how Pinecone's vector indexing and querying mechanisms work. -Knowing how to use Pinecone's metadata handling and filtering features. -Experience with optimizing vector dimensions and choosing appropriate vector embeddings for JSON data. -Experienced in choosing and using the appropriate Pinecone similarity metrics for different datasets. -Knowledge in Pinecone's upsert and query operations for dynamic data ingestion. -Proficiency in schema design for vector databases to support scalable user ID associatio...

    $235 (Avg Bid)
    11 bids

    ...managing all enterprise documents efficiently. 2. Goals Automate document processing and categorization using AI. Enhance document search capabilities through Bing integration. Improve information accessibility and collaboration across the organization. Reduce administrative tasks associated with document management. Enhance document security and compliance. 3. Functional Requirements Document Ingestion: Support various document formats (e.g., PDF, Word, Excel, image). Automatic document import from various sources (e.g., email, scanner, file systems). Manual document upload with metadata tagging. Document Processing: Automatic text extraction using Optical Character Recognition (OCR). Content analysis and keyword identification using Natural Language Processing (NLP). Docume...

    $6356 (Avg Bid)
    51 bids

    I am in immediate need of a seasoned data engineer with proven experience in Azure, Spark, and Python to assist with a critical data analysis project. Here’s what you’ll be diving into: - Data Ingestion: Implementing robust mechanisms to ingest data into Azure. - Data Transformation: Applying transformations using Spark to prepare the data. - Data Analysis: Conducting deep analysis to extract actionable insights. Ideal Skills: - Proficiency in Azure cloud services - Expertise in Apache Spark for big data processing - Strong programming skills in Python - Experience with data modeling and processing - Ability to document and communicate findings clearly This project requires someone who can ensure efficienc...

    $981 (Avg Bid)
    15 bids

    ...with cutting-edge technologies. Your role will be fundamental in implementing solutions that streamline operations, enhance our data-handling capabilities, and ultimately boost our project's efficiency. Here's a snapshot of what I need and what you'd be working on: - Developing Robotic Process Automation (RPA) to automate mundane, repetitive tasks. - Creating sophisticated data classification models. - Managing data ingestion seamlessly from our AWS platform. Ideal Skills and Experience: - Proficient with AWS for data management and integration. - Experienced in setting up RPA systems for business process automation. - Skilled in designing and implementing data classification algorithms. - Ability to translate requirements into ...

    $1155 (Avg Bid)
    58 bids

    I am seeking a data scientist for a one-time project to create a model prototype with synthetic data. Tasks: 1. Create synthetic data based on some datasets that were created previously. Combine these datasets, keep only important rows, and afterwards create the dataset. 2. Then, based on already premade code, create additional samples of data for model ingestion. 3. Execute logistic regression models in a Jupyter notebook (already-made code) and calculate performance, building some performance graphs. This is not a solo project, so you will need to communicate with me closely for guidance on each step. This is part of my volunteer project for creating a model concept without real data; due to lack of time I need hands-on help with finalizing the projec...
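    Step 2 — generating additional samples for model ingestion — could look roughly like this: resample existing rows with small perturbations. The column names, the noise scale, and the jitter-only-numeric-fields choice are all assumptions for illustration, not the client's spec.

```python
# Hedged sketch of synthetic-sample generation by perturbed resampling.
# Column names and noise scale are invented for illustration.
import random

def synthesize(rows, n, noise=0.05, seed=42):
    rng = random.Random(seed)                 # fixed seed for reproducibility
    out = []
    for _ in range(n):
        base = rng.choice(rows)
        out.append({
            "age": base["age"],               # copied as-is
            "score": base["score"] * (1 + rng.uniform(-noise, noise)),  # jittered
            "label": base["label"],           # target kept unchanged
        })
    return out

seed_rows = [{"age": 30, "score": 0.7, "label": 1},
             {"age": 45, "score": 0.4, "label": 0}]
samples = synthesize(seed_rows, n=100)
print(len(samples))                           # → 100
```

    Keeping the label unchanged while jittering only the features is the usual way to avoid leaking noise into the target during later logistic-regression training.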

    $27 / hr (Avg Bid)
    37 bids

    As a passionate data enthusiast, I am looking for an expert developer to build a versatile and efficient web scraping tool. The application should extract several types of information from a vast number of URLs and domains, including: - Textual content - Image files - Numerical data - SSL status - Domain registration date - and more info...... I anticipate the tool to be proficient in handling high volumes of information and optimized for speed while maintaining noteworthy accuracy. Once collected, the data should be organized into a common table without distinction of domains. Though not explicitly specified, those who are proficient in Python, JavaScript, or Java may find this project of interest given the nature of the task. However, I am not partial to any programmin...

    $22 / hr (Avg Bid)
    35 bids

    ...chickpeas, curds, quinoa, Greek yogurt, peanuts, and almonds likewise function as major contributors of protein. Vitamin A: It regulates the immune system. Known as the "anti-infective vitamin," this nutrient keeps your skin, mouth, stomach, and lungs healthy so that they can fight infection. It's also key for sharp eyesight. Consume it with some fat for better absorption. Yam, pumpkin, carrots, and spinach are loaded with Vitamin A. Vitamin C (L-ascorbic acid): It helps the body build strong skin and connective tissue, which blocks the entry of foreign microorganisms. Vitamin C also acts as an antioxidant, shielding cells from damage. It additionally protects against anaemia by helping us absorb additional iron ...

    $15 / hr (Avg Bid)
    40 bids

    Project Title: Snowflake, DBT, Dataiku Project Support - Urgent I am in need of a data engineer freelancer who can provide urgent support for the project. The ideal candidate should have experience and expertise in working with Snowflake, DBT, and Dataiku, and be able to solve problems. Specific Requirements: - Creating data products in DBT with source and target as a Snowflake warehouse. - Integration of Dataiku with existing systems and databases. - Experience with data ingestion. - Creating jobs in Octopus and Tidal. Skills and Experience: - Proven experience in implementing Snowflake, DBT and Dataiku - Strong knowledge of Dataiku features and functionalities - Familiarity with data integration and data management best practices - Proficiency in SQL, Pytho...

    $30 (Avg Bid)
    2 bids

    We are in search of a skilled website designer to join our team and create/design a dynamic online presence that involves video ingestion and a backend that allows for the tracking and selection of those submissions. We are looking to effectively connect with today's youth and align with their current online experiences; the designer will play a crucial role in developing youth-focused fonts/images/experiential flow that resonate with our target audience. The website should not only be visually appealing but also user-friendly and intuitive, ensuring a seamless experience for young entrepreneurs and viewers alike.

    $69 / hr (Avg Bid)
    45 bids

    Looking for an expert Azure Data Engineer to assist with multiple tasks. Your responsibilities will include: - Implementing and managing Azure Data Lake and data ingestion. - Developing visual reports, KPI scorecards, and dashboards using Power BI. - Creating, deploying, and managing resources in Power Apps. I need someone who can utilize these platforms to achieve three main objectives: - Perform sophisticated data analysis and visualization. - Enable advanced data integration and transformation. - Build custom applications to meet specific needs. Candidates should have an advanced understanding of Azure Data Lake, Power BI, and Power Apps, bringing a minimum of 6 years' experience with Databricks. Proficiency in Python, SQL, PostgreSQL,...

    $46 / hr (Avg Bid)
    28 bids

    Project Title: Pyspark Data Engineering Training Overview: I am a beginner/intermediate in Pyspark and I am looking for a training program that focuses on data processing. I prefer one on one and written guides as the format for the training. Skills and Experience Required: - Strong expertise in Pyspark and data engineering - Excellent knowledge of data processing techniques - Experience in creating and optimizing data pipelines - Familiarity with data manipulation and transformation using Pyspark - Ability to explain complex concepts in a clear and concise manner through written guides - Understanding of best practices for data processing in Pyspark Training Topics: The training should primarily focus on data processing. The foll...

    $31 / hr (Avg Bid)
    72 bids

    ...Databricks from an on-prem audit file. The project requirements are as follows: Frequency of ingestion: - The audit file needs to be ingested into Databricks on a daily basis. Size of the audit file: - The approximate size of the audit file is between 1GB and 10GB. Desired output format in Databricks: - The desired output format in Databricks is Delta Tables. Ideal skills and experience for the job: - Experience with ingesting data into Databricks from on-prem sources. - Proficiency in working with Delta Tables in Databricks. - Strong understanding of data ingestion processes and best practices. - Familiarity with handling large file sizes for efficient data processing. - Ability to set up automated ingestion processes for daily updates. I...

    $30 / hr (Avg Bid)
    35 bids

    Do not respond to this project. This is a test project to analyze bots on this site. I'm looking to develop a Spring Boot application primarily for web development. The app should be designed to handle data processing microservices, more specifically focusing on data ingestion. The ideal candidate would possess strong skills in Spring Boot, microservices, and data processing. They should understand web development thoroughly and have experience in implementing data ingestion functionality. Key skills and experiences include: - Expertise in Spring Boot - Proficiency in web development - Experience with data proc...

    $626 (Avg Bid)
    75 bids

    ...is capable of talking to my internal data. The model architecture should be efficient, scalable, and capable of handling large datasets. Custom Data Training: Implement a training pipeline that allows the AI model to be trained on our proprietary internal data. Ensure that the software supports easy data ingestion and preprocessing for model training. Integration: Integrate the AI software into our existing infrastructure. Provide documentation and support for integration with our internal systems. User Interface: Develop a user-friendly interface for interacting with the AI software. Include features for data input, model configuration, and result visualization. Scalability: Design the software to be scalable, allowing for future expansion and ...

    $311 (Avg Bid)
    21 bids

    ...managing all enterprise documents efficiently. 2. Goals Automate document processing and categorization using AI. Enhance document search capabilities through Bing integration. Improve information accessibility and collaboration across the organization. Reduce administrative tasks associated with document management. Enhance document security and compliance. 3. Functional Requirements Document Ingestion: Support various document formats (e.g., PDF, Word, Excel, image). Automatic document import from various sources (e.g., email, scanner, file systems). Manual document upload with metadata tagging. Document Processing: Automatic text extraction using Optical Character Recognition (OCR). Content analysis and keyword identification using Natural Language Processing (NLP). Docume...

    $4765 (Avg Bid)
    60 bids

    Databricks Data Engineer Scope of the project: - Data ingestion and processing Preferred programming language: - Python Specific tools or technologies: - Delta Lake Skills and experience required: - Strong knowledge of Python programming language - Proficiency in data ingestion and processing - Experience with Delta Lake technology - Familiarity with Databricks platform and Apache Spark is a plus - Data Modelling

    $28 / hr (Avg Bid)
    41 bids

    Experience in designing, implementing, and managing large-scale infrastructure projects. Expertise in setting up and configuring OpenShift clusters on bare-metal servers. Strong knowledge of networking requirements for high-performance, low-latency communication. Experience in designing and optimizing Spark clusters, including auto-scaling mechanisms. Proficiency in setting up and configuring MinIO for high scalability and data ingestion. In-depth knowledge of Elasticsearch cluster design, optimization, and security. Strong scripting and automation skills (e.g., Ans...

    $956 (Avg Bid)
    5 bids

    ...Python to build a cutting-edge chatbot platform. This platform will specialize in ingesting PDFs, text files, and various documents to create a custom chatbot tailored to individual company data and style. Key features include business registration, document upload for data storage, and dynamic chatbot training to reflect the unique voice of each company. The role demands continuous updates and improvements to ensure the chatbot remains current with daily company-specific information. Responsibilities: Develop and maintain a platform using Langchain and Python. Implement features for document ingestion, data storage, and chatbot training. Create a user-friendly interface for business registration and document management. Ensure regular updates and enhanceme...

    $697 (Avg Bid)
    98 bids

    I'm looking for someone to make a system where users either connect with an SSO or enter API credentials, depending on what's available for the site. The goal is for a user to connect all of the accounts they have. All of their data and messages will be connected, downloaded, and put into a vector database / into multi-modal processing to be conversed with using AI. So for example, there would be a form where I 'log in my Gmail', or maybe Gmail won't let me and I have to enter my API key or OAuth; regardless, once I do so, all my emails are put into my vector database. Every user has their own database. We will constantly add more datasources and more ways ...

    $19 / hr (Avg Bid)
    45 bids

    Assignment Log Ingestor and Query Interface Deadline Sunday, 19 November, 11.59 pm (IST) Objective Develop a log ingestor system that can efficiently handle vast volumes of log data, and offer a simple interface for querying this data using full-text search or specific field filters. Both the systems (the log ingestor and the query interface) can be built using any programming language of your choice. The logs should be ingested (in the log ingestor) over HTTP, on port 3000. We will use a script to populate the logs into your system, so please ensure that the default port is set to the port mentioned above. Sample Log Data Format: The logs to be ingested will be sent in this format. JSON { "level": "error", "message": "Failed...
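    The core of the assignment — store logs, then query by full-text search or specific field filters — can be sketched as below. Only the in-memory store/query logic is shown; the required HTTP layer on port 3000 (a small Flask or http.server app) would sit in front of these functions, and the in-memory list is a stand-in for a real index.

```python
# Sketch of the log ingestor's store/query core. The HTTP layer on port
# 3000 is omitted; LOGS is an in-memory stand-in for a real index.
LOGS = []

def ingest(entry: dict):
    LOGS.append(entry)

def query(text=None, **field_filters):
    """Full-text search across values, combined with exact-match field filters."""
    results = []
    for log in LOGS:
        if any(log.get(k) != v for k, v in field_filters.items()):
            continue                          # a field filter rejected it
        if text and not any(text in str(v) for v in log.values()):
            continue                          # full-text term not found
        results.append(log)
    return results

ingest({"level": "error", "message": "Failed to connect to DB"})
ingest({"level": "info",  "message": "Server started"})
print(query(text="Failed", level="error"))    # → the single error entry
```

    At the stated "vast volumes" scale, the linear scan here would be replaced by an inverted index or a search engine backend; the interface shape stays the same.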

    $25 (Avg Bid)
    3 bids

    Assignment Log Ingestor and Query Interface Deadline Sunday, 19 November, 11.59 pm (IST) Objective Develop a log ingestor system that can efficiently handle vast volumes of log data, and offer a simple interface for querying this data using full-text search or specific field filters. Both the systems (the log ingestor and the query interface) can be built using any programming language of your choice. The logs should be ingested (in the log ingestor) over HTTP, on port 3000. We will use a script to populate the logs into your system, so please ensure that the default port is set to the port mentioned above. Sample Log Data Format: The logs to be ingested will be sent in this format. JSON { "level": "error", "message": "Failed...

    $15 (Avg Bid)
    5 bids

    Looking for someone who has a good knowledge of PySpark, Airflow DAGs, GitHub, Pandas, and the Agile framework. Overall, the candidate should be well aware of the data ingestion approach. Knowledge of Google Cloud Platform is a bonus.

    $400 (Avg Bid)
    24 bids

    Project Title: Data Platform with Snowflake integration Description: I am looking for a skilled developer to create a robust data platform with Snowflake integration. The platform will require the following: Data Sources: - Integration with API endpoints to retrieve data for processing and analysis. Data Ingestion: - Batch ingestion method preferred for data ingestion. Scalability and Performance: - High scalability and performance are expected for the data platform to handle large volumes of data efficiently. Ideal Skills and Experience: - Strong experience in working with Snowflake and integrating it with various data sources. - Proficiency in developing data platforms with API integrations. -...

    $634 (Avg Bid)
    87 bids

    Store Sales Data Analysis: A Data Engineering Capstone Project Project Overview The project aims to analyze global sales data to offer actionable insights into regional sales trends, item popularity, and profitability. Real-World Implications • Optimizing Inventory: Know what items sell well in which regions. • Sales Strategy: Develop targeted sales strategies for different markets. Target Audience • Sales Managers • Business Analysts • Data Scientists Technologies and Tools • Data Processing: Pandas, Spark • Query Language: Hive • Data Visualization: Matplotlib, Seaborn • Big Data Technologies: HDFS, YARN Data Source The dataset includes: • Transaction Information: Region, Country, It...

    $109 (Avg Bid)
    10 bids


    I am looking for a developer to create an API ingestion billing process for my project. Here are the requirements: Programming Language: Python - The billing process should track detailed usage statistics for each API call. - I am considering integrating the API with an existing billing system. - If an existing billing system is not available, I would like one created from scratch. Ideal Skills and Experience: - Proficiency in Python programming language. - Experience with API development and integration. - Knowledge of billing systems and API usage tracking. - Strong attention to detail for accurate tracking of usage statistics.
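    A minimal sketch of the per-call usage tracking this billing process needs: a decorator counts calls per endpoint and a meter totals the bill. Endpoint names and prices are illustrative (prices are in integer cents to avoid float rounding); a real system would persist the counters.

```python
import functools
from collections import Counter

class UsageMeter:
    """Track per-endpoint API call counts and compute a simple bill."""

    def __init__(self, price_per_call_cents):
        self.price_per_call_cents = price_per_call_cents  # e.g. {"search": 2}
        self.calls = Counter()

    def metered(self, endpoint):
        """Decorator that counts each call to the wrapped endpoint."""
        def deco(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                self.calls[endpoint] += 1
                return fn(*args, **kwargs)
            return wrapper
        return deco

    def bill_cents(self):
        return sum(self.calls[e] * p for e, p in self.price_per_call_cents.items())

meter = UsageMeter({"search": 2})

@meter.metered("search")
def search(query):
    return f"results for {query}"

for _ in range(3):
    search("kafka")
print(meter.calls["search"], meter.bill_cents())  # 3 6
```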

    $688 (Avg Bid)
    $688 Avg Bid
    40 bids

    1. Change the app to use the new DB, removing the tab. 2. Record usage in Mongo. 3. (Optional) If possible, render PlantUML diagrams, using the public server for now and a self-hosted one later, switchable via a variable. 4. (Optional) If possible, allow a Stripe coupon at subscription setup so specific registrations can be discounted. Also update the ingestion script.

    $394 (Avg Bid)
    $394 Avg Bid
    1 bid

    I am looking for an experienced HDFS and PySpark expert to assist me with various tasks related to data ingestion, storage, processing, and analysis. The ideal freelancer should have a strong background in these technologies and be able to provide past work examples that showcase their expertise. Key requirements: - Expertise in HDFS and PySpark Timeline: - The project is expected to be completed within 1-2 weeks. If you meet these requirements and have the necessary experience, please include details of your past work and relevant experience in your application.
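    The real work here would run as PySpark jobs over HDFS, which needs a cluster; this dependency-free generator sketch only shows the ingest, clean, and aggregate stages such a pipeline chains together, with made-up data and field names:

```python
def ingest(lines):
    """Parse raw 'name,score' lines (ingestion stage)."""
    for ln in lines:
        name, score = ln.split(",")
        yield name.strip(), score.strip()

def clean(rows):
    """Drop rows whose score is not an integer (cleaning stage)."""
    for name, score in rows:
        if score.lstrip("-").isdigit():
            yield name, int(score)

raw = ["alice, 10", "bob, oops", "alice, 5"]

# Aggregate stage: sum scores per name, skipping the bad row.
totals = {}
for name, score in clean(ingest(raw)):
    totals[name] = totals.get(name, 0) + score
print(totals)  # {'alice': 15}
```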

    $67 / hr (Avg Bid)
    $67 / hr Avg Bid
    7 bids

    InfluxDB and its query language - Familiarity with RESTful API development Project Requirements: - Develop a Spring Boot application to query data from an existing InfluxDB database - Implement the necessary API endpoints to fetch data based on specific requirements - Detailed requirements for the API will be provided by the client - Ensure efficient, optimized database queries for good performance A few basic requirements: 1. Search InfluxDB and return results in JSON format 2. Send periodic emails driven by InfluxDB queries, either alerts or scheduled reports, in HTML format 3. Create users/buckets and expert queries 4. MQTT ingestion Experience with CrateDB is a plus. Writing Node-RED functions (or Node.js) is a must. Additional Information: - The fr...
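    Requirement 1 above (query results as JSON) is really just row shaping. The posting wants Spring Boot, so a Java implementation would differ; this is a language-agnostic sketch in Python with hypothetical field names showing the shape of the payload:

```python
import json
from datetime import datetime, timezone

def rows_to_json(rows):
    """Serialize (timestamp, field, value) time-series rows as a JSON
    document, as an API endpoint returning query results might."""
    payload = [
        {"time": ts.isoformat(), "field": field, "value": value}
        for ts, field, value in rows
    ]
    return json.dumps({"results": payload})

rows = [(datetime(2024, 1, 1, tzinfo=timezone.utc), "temp", 21.5)]
print(rows_to_json(rows))
```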

    $772 (Avg Bid)
    $772 Avg Bid
    22 bids

    I am looking for a data engineer with expertise in Python, SQL, and data visualization. The ideal candidate will be able to work with structured data and have experience in data analysis. The end goal of this project is to perform data analysis using the provided structured data. Recommended Skills: 5+ years building large-scale streaming data platforms on the cloud. Expert in technologies like Kafka, Flink, and Spark Streaming. Proficiency in SQL and Python (or Java); experience with Scala is a plus. Expertise leveraging monitoring tools like New Relic, SolarWinds, Prometheus, and Grafana. Strong hands-on experience managing AWS big data services. Responsibilities: - Design and implement robust data pipelines to efficiently collec...
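    A core building block of the streaming platforms this role describes is windowed aggregation. Kafka Streams or Flink would compute this continuously over an unbounded stream; this sketch does one batch of fixed-width tumbling windows over made-up events:

```python
from collections import defaultdict

def tumbling_window_counts(events, width):
    """Bucket (timestamp, key) events into fixed-width tumbling windows
    and count occurrences of each key per window."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // width) * width
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (11, "click")]
print(tumbling_window_counts(events, 5))
# {(0, 'click'): 2, (5, 'view'): 1, (10, 'click'): 1}
```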

    $20 - $40 / hr
    Sealed
    $20 - $40 / hr
    21 bids

    Hello! I am in need of assistance with a task that needs to be completed urgently (within 24 hours). Specifically, I need help with data ingestion, so it is essential that whoever I choose has the necessary skills and expertise to get the job done. No detail can be overlooked in completing this job, so accuracy and promptness are of the highest priority. I understand that the timeline for completion is urgent, but this must be done with the utmost care and attention to detail. If you are confident in your abilities and think you can provide assistance with what I need, I would be grateful if you could apply for the project. Thank you for considering.

    $24 / hr (Avg Bid)
    $24 / hr Avg Bid
    9 bids

    ...nitrogen tetroxide oxidizer (NTO) have flexible long term reusable elastomeric materials available, and a bellows system has limitations on percentage capability of transfer and life-cycle reusability. See: Less than 5% liquid ingestion by volume into the gas/vapor stream being expelled from a tank while venting is allowed. Ideally, 0% ingestion is preferred, but it is understood that some residual liquid amounts entering the vent may be inevitable depending on the solution design. An example of 5% liquid ingestion to vent space: Suppose that, prior to venting, a tank holds 100 gallons of liquid. After venting, 5 gallons of liquid or less are “lost” to the vent resulting in a remaining amount of 95 to 100 gallons of liquid in the tank. Solution Requi...

    $40470 (Avg Bid)
    Featured Guaranteed Sealed Top Contest
    $40470
    44 entries

    Project Title: Azure Data Factory Specific Tasks: - Data integration in Azure Data Factory Data Sources and Destinations: - Both data source and destination Preferred Data Integration Pattern: - Real-time streaming Ideal Skills and Experience: - Strong knowledge and experience in Azure Data Factory - Expertise in data integration and real-time streaming - Familiarity with batch processing and hybrid data integration patterns - Ability to configure and customize data sources and destinations in Azure Data Factory - Proficiency in designing and implementing real-time streaming solutions - Understanding of data transformation and data ingestion processes
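    For orientation, an abridged, illustrative fragment of an Azure Data Factory pipeline definition with a single Copy activity (dataset names are hypothetical, and most properties are omitted; ADF's Copy activity is batch-oriented, so true real-time streaming would involve additional services such as Event Hubs):

```json
{
  "name": "CopySourceToSinkPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopySourceToSink",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SourceDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SinkDataset", "type": "DatasetReference" }
        ]
      }
    ]
  }
}
```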

    $24 (Avg Bid)
    $24 Avg Bid
    9 bids