Hadoop is an open-source software platform that supports the distributed processing of large datasets across clusters of computers, enabling organizations to store and analyze unstructured data quickly and accurately. With the help of a Hadoop Consultant, this powerful software can scale your data architecture and allow organizations to capture, store, process and organize large volumes of data. Hadoop offers a variety of features including scalability, high availability and fault tolerance.

An experienced Hadoop Consultant at your side can help you develop projects that take advantage of this powerful platform and maximize your big data initiatives. Hadoop Consultants can create custom applications that integrate with your existing infrastructure, helping you accelerate analytics, process large volumes of web data, and derive insights from unstructured sources such as internal emails, log files and streaming social media data for a wide variety of use cases.

Here are some projects our expert Hadoop Consultants have created using this platform:

  • Designed arrays of algorithms to support Spring Boot and microservices
  • Wrote code to process unstructured text data efficiently
  • Built Python programs for parallel breadth-first search execution
  • Used Scala to create machine learning solutions with Big Data integration
  • Developed recommendation systems as part of tailored solutions for customer profiles
  • Constructed applications that profiled and cleaned data using MapReduce with Java
  • Created dashboards in Tableau displaying visualizations based on Big Data analytics
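To give a flavour of the MapReduce-style text processing listed above, here is a minimal word-count sketch in Python. It mirrors the mapper/reducer contract used by Hadoop Streaming; the function names and sample input are illustrative only, not taken from any of the projects above.

```python
from itertools import groupby

def mapper(lines):
    """Emit a (word, 1) pair for every word in the input lines."""
    for line in lines:
        for word in line.strip().lower().split():
            yield word, 1

def reducer(pairs):
    """Sum counts per word; assumes pairs arrive sorted by key,
    as Hadoop's shuffle phase guarantees between map and reduce."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["big data needs big tools"]
    counts = dict(reducer(sorted(mapper(sample))))
    print(counts)  # {'big': 2, 'data': 1, 'needs': 1, 'tools': 1}
```

In a real Hadoop Streaming job the mapper and reducer would be separate scripts reading stdin and writing stdout, with the framework performing the sort between them; the `sorted()` call here stands in for that shuffle step.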

Thanks to the capabilities offered by Hadoop, businesses can quickly gain insights from their unstructured datasets. With the power of this robust platform at their fingertips, Freelancer clients have access to professionals who bring the experience necessary to build solutions on the platform. You too can take advantage of these benefits - simply post your Hadoop project on Freelancer and hire your own expert Hadoop Consultant today!

Based on 10,057 reviews, clients rate our Hadoop Consultants 5 out of 5 stars.
Hire a Hadoop Consultant

    12 jobs found

    The goal is to design a future-proof relational backbone for a multi-channel retail platform, built natively on PostgreSQL. I already run an operational system, and its live transactional records will have to be migrated into the new structure without downtime. Everything else—users, inventory, orders—will start fresh. Core design expectations: The operational schema should be fully normalised to 3NF and optimised from day one for high concurrency. Think table partitioning (range and list), well-chosen GIN and B-Tree indexes, plus JSONB columns where semi-structured flexibility makes sense. Triggers, constraints and stored procedures must enforce business logic consistently, so the application tier can stay lightweight. Analytics layer: Alongside the OLTP schema I need a se...

    €1390 Average bid
    67 bids

    I need an experienced Python engineer who works confidently with AWS Glue to build and manage a small suite of data-integration jobs for a Hyderabad-based project. The core of the work is to design and automate Glue ETL pipelines that pull data from our production databases, catalog it accurately, and transform it into analytics-ready tables. Here is what I expect from the engagement: • Develop, test, and deploy Glue ETL jobs in Python. • Populate and maintain the Glue Data Catalog so new tables are discoverable and properly version-tracked. • Implement efficient transformation logic that cleans, enriches, and partitions data for downstream reporting. • Optimise job performance and cost by selecting the right worker types, job parameters, and database connectio...

    €83 Average bid
    6 bids

    Job Title: Senior Cloud Database Engineer – PostgreSQL Migration Architect. Location: Remote. Work Timing: UK Business Hours (GMT/BST). Engagement Type: Freelance (Long-Term Possible). Platform: Freelancer.com. Project Overview: We are seeking a highly skilled and experienced Senior Cloud Database Engineer with deep specialization in PostgreSQL migrations and distributed database infrastructure. This role is for a demanding enterprise client, so we require a proven track record of successful large-scale cloud database migrations with verifiable experience. You must be comfortable working during UK business hours and collaborating with cross-functional cloud and infrastructure teams. Role Summary: Senior Cloud Database Engineer – PostgreSQL Migration Architect – Data Infras...

    €986 Average bid
    10 bids

    I have a Hadoop cluster holding several large data sets, and I need a seasoned PySpark developer who also writes rock-solid SQL. The immediate aim is to connect to the cluster (YARN/HDFS with Hive metastore), develop or refine PySpark jobs, optimise the accompanying SQL, and make sure everything runs smoothly end-to-end. You’ll receive access to a staging namespace plus a sample of the data. Once the logic checks out we’ll promote the code to the full environment. Deliverables • A clean, well-commented PySpark notebook or .py job that executes successfully on the cluster • The corresponding SQL script or view definitions ready for Hive or spark-sql • A concise README detailing execution steps, parameters, and expected outputs Acceptance criteria ...

    €67 Average bid
    8 bids

    I need an experienced Python engineer who works confidently with AWS Glue to build and manage a small suite of data-integration jobs for a Hyderabad-based project. The core of the work is to design and automate Glue ETL pipelines that pull data from our production databases, catalog it accurately, and transform it into analytics-ready tables. Here is what I expect from the engagement: • Develop, test, and deploy Glue ETL jobs in Python. • Populate and maintain the Glue Data Catalog so new tables are discoverable and properly version-tracked. • Implement efficient transformation logic that cleans, enriches, and partitions data for downstream reporting. • Optimise job performance and cost by selecting the right worker types, job parameters, and database connectio...

    €79 Average bid
    12 bids

    Hiring: DevOps Engineers. Requirements: expertise in CentOS, Ubuntu, Debian; experience with HBase, Hadoop, Storm, ArangoDB, Prometheus; strong benchmarking and system scalability testing skills. Responsibilities: configure and optimize high-availability servers and databases; implement monitoring and performance evaluation tools; review and enhance DevOps workflows. Long-term opportunities available!

    €417 Average bid
    18 bids
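Benchmarking work like the DevOps posting above usually starts from a simple timing harness. The sketch below uses only the Python standard library to time a function across growing input sizes; the `sort_workload` function is a hypothetical stand-in, not part of the job spec.

```python
import time

def benchmark(fn, sizes, repeats=3):
    """Time fn(n) for each input size n, keeping the best of `repeats` runs
    to reduce noise from other processes on the machine."""
    results = {}
    for n in sizes:
        timings = []
        for _ in range(repeats):
            start = time.perf_counter()
            fn(n)
            timings.append(time.perf_counter() - start)
        results[n] = min(timings)
    return results

def sort_workload(n):
    """Hypothetical workload: generate and sort n pseudo-random keys."""
    data = [(i * 2654435761) % 2**32 for i in range(n)]
    data.sort()

if __name__ == "__main__":
    for size, secs in benchmark(sort_workload, [1_000, 10_000, 100_000]).items():
        print(f"{size:>7} items: {secs:.4f}s")
```

Taking the minimum rather than the mean of the repeats is a common choice for micro-benchmarks, since the fastest run is closest to the code's true cost with the least interference.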

    I’m standing up a series of production data pipelines and need an IT professional who can move comfortably between Python scripting, SQL optimisation, PySpark transformations and Airflow orchestration. The immediate focus is end-to-end pipeline build-out: designing clean ingestion logic, transforming data in Spark, writing efficient queries and scheduling everything through well-structured Airflow DAGs. If you can demonstrate hands-on experience across all four technologies - Python, SQL, PySpark and Airflow—and enjoy owning a pipeline from raw source to curated output, I’d like to work together. Deliverables I’m expecting: • A working set of PySpark jobs that handle ingestion, transformation and output staging • Airflow DAGs that schedule, monit...

    €236 Average bid
    27 bids

    - **Core Architecture:** Spring Cloud + Kafka + Hadoop + Python Automation. This project requires a certain level of technical expertise.

    €15 Average bid
    13 bids

    I have an existing analytics initiative that now needs a dedicated Redshift-based warehouse. The core objective is to design and implement a robust schema in Amazon Redshift, then ingest data coming from three different sources—our operational SQL databases, a set of RESTful APIs, and periodic flat-file drops in CSV or JSON. Here is what I’m aiming for: • A well-structured Redshift warehouse (star or snowflake schema, whichever is most appropriate) built to scale and documented clearly. • Reliable, automated ingestion pipelines for each source type. For SQL we currently use PostgreSQL and MySQL; for APIs the payloads are mostly JSON; the flat files live in S3. • Transformations that standardise data types, handle slowly changing dimensions, and enforce dat...

    €82 Average bid
    8 bids

    I need a propensity-modelling software package that plugs directly into our CRM system, website analytics, and sales database, unifying those streams into a clean, continuously updated dataset. On top of that data layer, the build must train, evaluate, and deploy the best-performing predictive models—whether regression, decision-tree, neural-network, or any other technique that proves superior—then surface the results through a lightweight web interface and an API our teams can call in real time. Key deliverables • Automated ETL jobs and data-quality checks for the three sources mentioned above • Modular training pipeline with experiment tracking, lift/ROC reporting, and feature-importance visuals • Scoring service exposed via REST (or GraphQL) endpoints plu...

    €13118 Average bid
    34 bids
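The lift/ROC reporting mentioned in the propensity-modelling brief above rests on a computation small enough to sketch in plain Python. This is an illustrative AUC calculation via the Mann-Whitney rank statistic, not the poster's actual pipeline; in practice a library routine such as scikit-learn's `roc_auc_score` would be used instead.

```python
def roc_auc(labels, scores):
    """Compute ROC AUC as the probability that a randomly chosen positive
    example is scored above a randomly chosen negative one, counting
    ties as half a win (the Mann-Whitney U formulation)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative label")
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

if __name__ == "__main__":
    labels = [1, 0, 1, 0, 1]
    scores = [0.9, 0.3, 0.8, 0.6, 0.4]
    print(round(roc_auc(labels, scores), 3))  # 0.833
```

The pairwise loop is O(|pos| x |neg|), which is fine for a report on a sample; production scoring over millions of rows would use a sort-based O(n log n) variant.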

    Key Responsibilities: • Design, develop, and deploy AI/ML solutions end-to-end • Lead AI architecture and solution design for enterprise applications • Build and optimize machine learning and deep learning models • Deploy and monitor models in production environments • Collaborate with cross-functional teams including product and engineering • Mentor junior AI engineers and contribute to technical leadership • Conduct research and implement state-of-the-art AI techniques • Ensure data quality, security, and model performance optimization Required Skills & Qualifications: • 10+ years of experience in AI/ML or Software Engineering roles • Strong proficiency in Python and data processing libraries (NumPy, Pandas) • Hands-on experienc...

    €13 / hr Average bid
    24 bids

    Project Description My current production workload runs entirely on AWS EC2, and while usage grows, monthly costs are increasing and application response times are degrading. I’m looking for an experienced AWS / DevOps engineer to perform a data-driven infrastructure audit and recommend optimizations. The engagement will begin with a deep-dive assessment of the existing environment, followed by a cost- and performance-balanced architecture proposal. All findings must be supported with measurable evidence, not assumptions. Phase 1 — Infrastructure Audit. The audit should cover: EC2 CPU, memory, and disk I/O utilization; database behavior and performance (MySQL); network flow and latency analysis; CloudWatch metrics and logs review; identification of bottlenecks, ...

    €103 Average bid
    29 bids
