Your guide to getting data entry done for your business
Data entry is an important task, but choosing the wrong solution can seriously harm your company's productivity.
Data scraping is the process of extracting data from websites and databases. Data Scrapers are experts who can collect this data for clients, saving both time and money by automating the collection process. These professionals use tools such as Python, HTML, XML, Laravel, and more to access webpages and other sources of digital information, scrape large amounts of data, and analyze it in meaningful ways.
Here are some projects our expert Data Scrapers made real:
Data scraping is an incredibly valuable tool that can help companies increase efficiency by streamlining their digital record-keeping. Our expert team of Data Scrapers is well-equipped to make these improvements in whatever form they’re needed. If you’re looking to improve your own business through data scraping, why not post your project on Freelancer.com? Our Data Scrapers are ready to help you reach your goals.
Based on 134,018 reviews, clients rate our Data Scraper freelancers 4.9 out of 5 stars.
Each month I have a small, repeatable task that takes about half an hour: entering and checking mixed text-and-numeric data that arrives in a spreadsheet. The flow is always the same, so once you understand the routine it should feel quick and predictable. Accuracy matters more than speed, and I need the file returned on the agreed date every month without exception. Because the dataset is drawn directly from spreadsheets, you’ll simply open the source sheet, copy or key the mixed fields into the target file, make sure everything lines up exactly, then save and send it back. That’s it—no complex formulas, no macros, just careful data handling. I’m only able to work with U.S.-based freelancers for this role. If you have reliable month-to-month availability and a ...
I have several datasets that need to be transformed into clean, well-structured information. At this stage I’m still defining the exact scope, so I’m open to a range of data-writing approaches—whether that involves straightforward data entry from spreadsheets, deeper analytical summaries drawn from a database, or shaping web-scraped records into a narrative that can power a report, a machine-learning pipeline, or an interactive dashboard. Here’s what I need from you: • A brief outline of how you would tackle the job once you see the raw files. • Your preferred tools—Excel, SQL, Python (Pandas, NumPy), or any other stack you’re comfortable with. • A sample of comparable work, if available, so I can gauge fit and style. • An es...
I need you to collect 1,000 product images, titles, and links from Temu. Just these three items. I think this should be pretty straightforward. If you do a good job, I’ll hire you on a long-term basis, as I’ll need someone to help me collect product information from Temu on an ongoing basis.
I have several Excel spreadsheets containing raw customer information that now need to be turned into a single, reliable data set. Your task is to go through the files, eliminate duplicates, fix obvious entry errors, standardise formats (names, phone numbers, email addresses, dates), and flag anything that looks inconsistent or suspicious. I will provide the spreadsheets plus a short style guide so you know exactly how each field should look when it is “clean”. You’re free to use any solid Excel-based method—formulas, Power Query, VBA, even third-party tools—so long as the end result meets the guidelines and can be inspected easily inside Excel. Deliverables • A cleaned master worksheet with all customer records consolidated and validated • A con...
I need a fresh dataset of 100,000 United-States–based companies delivered in a single Excel spreadsheet. Each row should contain four columns only: Company Name, Company Domain, CEO Full Name, and CEO Email Address. The file will feed an internal research project, so accuracy, recency, and proper formatting matter more to me than speed. What matters most: • Verified U.S. businesses (no subsidiaries registered abroad). • CEO details must match the domain’s current leadership—no outdated founders or generic “info@” emails. • Clean Excel formatting: one header row, no merged cells, UTF-8 characters intact. Before final delivery, I would like a random 200-row sample to spot-check for bounce-free emails and correct executive names. If more th...
I need a clean, well-segmented file of 10,000 active, verified email addresses drawn exclusively from professionals across the United States. The focus is on three core sectors: • Interior designers and decorators • Small contractors and builders • Kitchen and bath remodelers Names pulled from wallpaper installers, architects, or other adjacent trades are welcome only if they clearly fit one of the groups above; otherwise, please exclude them. Preferred harvesting grounds include Houzz, Google Maps, LinkedIn, and comparable industry directories where firm websites and direct decision-maker details are visible. I expect you to combine smart scraping techniques with manual cross-checks and a reputable validation tool so that every address passes a syntax, domain, an...
Hi, I need a data scraper who can scrape data from provided sources. Skills: Core technical skills: the freelancer should know Python (the most common scraping language) with libraries like Scrapy, BeautifulSoup, or Playwright. They should also be comfortable with browser automation tools like Selenium or Puppeteer (JavaScript-based), since sites like TipRanks are JavaScript-heavy and need a real browser to render. Anti-bot bypass experience: this is the most critical skill for this specific case. Look for someone experienced with handling CAPTCHAs (2Captcha, Anti-Captcha services), rotating proxies and residential IPs, spoofing browser headers and fingerprints, and bypassing Cloudflare or similar bot protection. TipRanks specifically uses these protections, so this experience is non-...
A reusable Python script is required to automate data scraping from a series of publicly accessible web pages. The script should accept a list of URLs, navigate through any paginated content, extract the specified fields, and save the results to CSV and JSON. The task suits someone with an intermediate grasp of Python who is comfortable working with libraries such as requests, BeautifulSoup, pandas, or, when a site relies on JavaScript, Selenium or Playwright. Clear, well-commented code and concise setup instructions are essential so the script can be dropped into an existing workflow without modification. Acceptance criteria and deliverables: • Fully functional .py script that runs from the command line. • Configuration section (or .env file) for URL list and field selec...
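A brief like this can be sketched end to end. The sketch below uses only the Python standard library so it runs anywhere; the brief's preferred requests/BeautifulSoup stack would slot into the same shape. It accepts a URL list, extracts one field per page, and writes both CSV and JSON. The CSS class `item-title` and the output file names are illustrative assumptions, not part of the brief.

```python
import csv
import json
import urllib.request
from html.parser import HTMLParser


class FieldParser(HTMLParser):
    """Collects the text of every element carrying a target CSS class."""

    def __init__(self, target_class):
        super().__init__()
        self.target = target_class
        self.capture = False
        self.fields = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.target in classes:
            self.capture = True

    def handle_data(self, data):
        if self.capture and data.strip():
            self.fields.append(data.strip())
            self.capture = False


def extract_fields(html, target_class="item-title"):
    """Return the text of every element with the given class, in document order."""
    parser = FieldParser(target_class)
    parser.feed(html)
    return parser.fields


def scrape(urls, out_csv="results.csv", out_json="results.json"):
    """Fetch each URL, extract the target field, and save CSV + JSON."""
    rows = []
    for url in urls:  # a paginated site would extend this list per page
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", "replace")
        rows.extend({"source": url, "title": t} for t in extract_fields(html))
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["source", "title"])
        writer.writeheader()
        writer.writerows(rows)
    with open(out_json, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)
```

In a real deliverable the URL list and field selectors would live in the configuration section the brief asks for, and the parsing layer would be swapped for BeautifulSoup (or Playwright for JavaScript-rendered pages).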
I need a clean, reliable dataset of 100,000 unique Australian-based websites together with their top-level decision maker—whether that person is titled President, CEO, Founder, or Owner. For every record include: • Website URL • Executive’s full name • Role / title as it appears publicly • Primary email address (direct where possible, otherwise a verifiable company email) • Direct phone number or main business line if no direct dial exists A simple spreadsheet (Excel or CSV) is ideal so I can filter, sort, and upload it into my CRM. I am fine if you scrape, enrich via LinkedIn, or use tools like Hunter, Clearbit, or similar, provided the final list is accurate, deduplicated, and at least 95 % of emails pass a standard validity test (e.g....
B2B Lead Generation – North America/EU/UK/AU - IT Services, Software, Digital Marketing, Consulting SMEs Receiving Cross-Border Payments Project Overview: I’m hiring a highly detail-oriented researcher/prospector to build a targeted list of Small & Medium Enterprises (SMEs) in the North America/EU/UK/AU/UAE working in industries like IT Services, Software, Digital Marketing, Consulting SMEs Receiving Cross-Border Payments. This is not a bulk data scraping task. The goal is to identify high-intent, relevant businesses that are already involved in cross-border transactions and would be strong prospects for a global payments solution. Ideal Company Profile (ICP): - Industry: IT Services, Software Development, Tech Consulting, Digital Agencies - Geography: United States &...
We are looking for an experienced full-stack developer/team to build a web-based platform that aggregates auction data (ELV and related categories) from multiple government and private websites. The goal is to create a centralized dashboard + automated notification system. Key requirements: 1. Data scraping / extraction: extract auction data from multiple websites (government + private portals); handle different formats (HTML, tables, PDFs if possible); schedule automated daily scraping. 2. Dashboard (frontend): built with React.js, featuring a list of auctions (filter, search, sort), source-wise filtering, date-wise filtering, a detailed view of each auction, and a clean, professional UI. 3. Backend system: an API to manage data; store data in a database; handle scraping jobs (cron/scheduler); prevent duplicat...
We are seeking a skilled freelancer to extract daily and historical stock market data from a finance website. The ideal candidate will have experience in data extraction, APIs and web scraping, with a strong understanding of finance and data analysis. The task involves collecting and organizing data into a structured format for further analysis. Should be a very easy and straightforward job; usually completed even within a few hours!
We are looking for a professional with experience in **data mining and information organization** to support the collection and structuring of business contacts. **Responsibilities:** * Research information using Google and other online sources. * Navigate corporate websites to identify key data. * Collect the contact's name and job title, company, phone number, and email address. * Organize and consolidate the information into a spreadsheet in Google Drive, ensuring accuracy and order. **Requirements:** * Proven experience in data mining. * Ability to perform advanced online searches. * Experience in virtual assistance ...
Build a cross-platform desktop bot that uses computer vision (template matching) to automate web form filling and data scraping, controlled through a web-based dashboard, with no Selenium or browser drivers. We need a vision-based UI automation bot capable of navigating real browsers (Chrome, Firefox, Brave, Opera) installed on the user's machine. The bot will fill web forms and scrape report data using image-based template matching, with no browser drivers or Selenium. A local web UI will serve as the control dashboard for operators to input data, monitor progress, and handle errors manually when needed. Core requirements: 1. Cross-platform support: must run natively on Windows, Linux, and macOS without platform-specific hacks. 2. Native browser control: open and control the...
Every time my Python/Flask data reader starts it throws an execution failure, halting the entire application. The crash occurs during startup—well before any web data is retrieved—so nothing ever reaches the UI. The reader should open Chrome, load a webpage, read the data from it, and pass the structured data downstream until it reaches an Excel file. I need you to trace the root cause of this startup failure, patch the code, and verify that the reader launches cleanly and delivers correctly formatted data. This software works on a specific webpage. Deliverables • Patched reader module that starts without errors and can finish the entire loop execution of the application. • Evidence (files created dynamically in my public server's shared files) confirming...
I will hand over my FootyStats login and a checklist of leagues (about 120 team-stats files in total). Once inside the dashboard, simply open each specified competition, grab the Team Statistics spreadsheet in CSV format, and ignore every match or player dataset you see. Folder structure must stay clean: one top-level directory, then a separate folder for each league named exactly as FootyStats labels it. Do not create season subfolders. When every CSV is in place, compress the whole directory into a single ZIP, upload it (Google Drive, Dropbox, WeTransfer—whatever is fastest), and send me the link here. Confidentiality is critical; please handle the credentials securely and delete them when the job is done. I will verify that all requested leagues are present and that none of the ...
A Python script is needed to pull text-only data from a website / mobile-app endpoint that returns XML and drop everything into a single Excel worksheet, with each category neatly separated into its own column. The source exposes only XML responses, so parsing must rely on built-in or an equivalent library such as lxml; once parsed, the data should be pushed to Excel via pandas or openpyxl. The script should: • Retrieve the full XML payload from the target URL or API, following any necessary headers or authentication I provide. • Map every relevant XML tag to a clearly named column in one worksheet, preserving order and encoding. • Export a clean .xlsx file in one run and overwrite or version files automatically (configurable). Deliverables: 1. Well-commented ...
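The parsing half of that brief can be shown with the standard library's `xml.etree`; pandas/openpyxl then handle the `.xlsx` export, as the posting suggests. The record tag `item` and the sample tags in the usage note are assumptions, since the real endpoint's schema isn't given.

```python
import xml.etree.ElementTree as ET


def rows_from_xml(xml_text, record_tag="item"):
    """Turn each record element into a dict of tag -> text.

    Dicts preserve insertion order in modern Python, so column order
    follows the tag order inside each record, as the brief requires.
    """
    root = ET.fromstring(xml_text)
    rows = []
    for record in root.iter(record_tag):
        rows.append({child.tag: (child.text or "").strip() for child in record})
    return rows


def export_to_excel(rows, path="output.xlsx"):
    """Write the parsed rows to a single worksheet, one tag per column."""
    # pandas (with openpyxl) does the .xlsx writing; imported locally so
    # the parsing step alone has no third-party dependency.
    import pandas as pd

    pd.DataFrame(rows).to_excel(path, index=False)
```

For example, feeding `rows_from_xml` a payload like `<catalog><item><name>A</name><price>1</price></item></catalog>` yields one row with `name` and `price` columns; authentication headers would be added at the retrieval step, which is omitted here.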
I need a focused lead-gen push aimed squarely at the e-commerce space, zeroing in on technology startups. My goal is a clean, verifiable list of prospects who are actively operating or servicing online stores and who fit a startup profile—typically under 150 employees and early- to mid-stage funding. Here’s the workflow I have in mind: • Research using platforms such as LinkedIn Sales Navigator, Apollo, Crunchbase or similar to spot genuine tech startups with an e-commerce angle. • Capture key data points: founder or senior decision-maker name, role, work email, LinkedIn URL, company URL, headcount and a brief note on their e-commerce product or service. • Verify every email for accuracy (0–5 % bounce rate on a small test sample before full delivery)....
Need weekly open, high, low, close, and volume data for a list of 1,215 stocks, scraped from TradingView.
I need someone who is very experienced with scraping. I have 82 Christian television channels on my website. I pull show/program names and times from a JSON file onto my website. The 1 (single) JSON file contains scraped show names and show times for weekdays and weekends for multiple channels. The person who has done this for me many times in the past is terrible at his job. I need to hire someone who is accurate and can scrape not all 82 channels but rather more like 68 channels. You will do this for me once a month. You will scrape show names along with times/days for 68 channels. I will provide you all the links for the TV guides on each website for each channel. You will add the scraped data into 1 big JSON file. You must follow my website channel names, time zone and javascript l...
I need a reliable pro who can take raw files and run a fast, fully automated data-entry workflow, then hand back a spotless dataset right away. The job is urgent, so I’m aiming for a turnaround that feels like “ASAP” rather than “next week.” Here’s exactly what I expect you to deliver: • Duplicate removal • Error correction (typos, misplaced fields, obvious inconsistencies) • Consistent data formatting across every column and row All entry should be handled through an automated process—scripts, macros, Python, R, whatever tool you trust for speed and accuracy—so the final file is ready for immediate analysis with no manual fixes on my side. Send a brief outline of your approach and the estimated timeframe you’d ...
I help buyers who do not have the time or skills to dig through listings find the right used-car ads and, most importantly, obtain the direct contact details of each seller. To keep up with demand I now need a reliable freelancer who can fetch those contacts quickly and accurately, ad after ad, and hand them back to me in a clean, ready-to-use format. Here is what you will actually do for me: • Identify the vehicle listings I specify (make, model, price range or region will vary from day to day). • Retrieve the seller’s published phone number and/or email exactly as it appears in the ad. • Note the ad’s URL, title, price and posting date so my client can reference it. • Deliver everything in a simple CSV or Google Sheet that I can forward directly. I am no...
I need help pulling mixed text-and-number records from several online databases, then running a thorough clean-up and verification pass on everything. The tables currently contain duplicates, inconsistent formats, and the occasional missing field, so the job is to bring them to a fully validated, analysis-ready state. You will: • Export or scrape the assigned datasets from the provided URLs or API keys. • Standardise dates, numbers, and naming conventions, remove duplicates, and flag suspect values. • Cross-check totals and key fields against the source to confirm accuracy. • Return a final, well-structured spreadsheet (CSV or Excel) plus a brief log of changes and any anomalies you could not resolve. I’m working on a tight turnaround, so clarity, speed,...
I have a collection of raw text records that need to be processed so they are accurate, consistent, and ready for downstream use. The task is strictly data processing—specifically text-based data cleaning. You will work with a spreadsheet (or CSV) containing entries that suffer from extra spaces, inconsistent casing, misspellings, and occasional duplicate rows. Your job is to: • Remove duplicates without losing any unique information • Standardize capitalization and spacing • Correct obvious typos using context (English only) • Flag any ambiguous or incomplete lines for my review Please return a cleaned file in the same format plus a short change log summarizing what was fixed. I prefer work done in Excel or Google Sheets, but Python (pandas) scripts ...
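The cleaning steps this posting lists (collapse stray spaces, normalise casing, drop duplicates without losing unique entries) can be sketched in a few lines of plain Python; a pandas version would follow the same logic on a DataFrame column. Title-casing stands in for whatever casing rule the client's style guide actually specifies, and the typo-correction and flagging steps are omitted since they need the real data.

```python
import re


def clean_record(text):
    """Collapse runs of whitespace and normalise casing (title case assumed)."""
    return re.sub(r"\s+", " ", text).strip().title()


def clean_records(records):
    """Dedupe case- and space-insensitively, preserving first-seen order.

    Each record is cleaned first, so '  john   SMITH ' and 'John Smith'
    count as the same entry and only one copy survives.
    """
    seen, cleaned_rows = set(), []
    for record in records:
        cleaned = clean_record(record)
        key = cleaned.lower()
        if key not in seen:
            seen.add(key)
            cleaned_rows.append(cleaned)
    return cleaned_rows
```

The change log the posting asks for would fall out naturally by recording every `(original, cleaned)` pair where the two differ.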
I need my mixed numerical-and-text data tidied up and entered into a structured Excel workbook. The raw files include duplicates, inconsistent capitalisation, stray spaces, and a few obvious format issues, so the first step will be a careful cleaning pass. Once the data is consistent, you’ll paste or import everything into the sheet I provide, making sure each record lines up with the correct headers. Please rely on the usual Excel toolkit—TRIM/PROPER, Find & Replace, remove duplicates, Data Validation rules, and any simple formulas or Power Query steps that keep the sheet error-free for future updates. Accuracy matters more than speed; every value must match the source once cleaned. Deliverable: one .xlsx file ready for immediate use, plus a brief change log describing t...
I currently have a Python-based data scraper built with Selenium/Requests, but it runs too slowly and crashes because of high memory usage while processing large datasets. It is not scalable. Requirements: optimize the memory footprint for 5,000+ records; refactor synchronous loops to asyncio/aiohttp; implement a robust error-handling and retry mechanism. I need an expert who can write high-performance Python code. No beginners, please.
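The refactor described above boils down to two patterns: retry with exponential backoff, and bounded concurrency so thousands of records don't all sit in memory or in flight at once. A minimal sketch follows; the `fetch` coroutine is injected, so the same code works with an aiohttp session wrapper (e.g. one calling `session.get`) or with a stub during testing. All names here are illustrative, not the poster's actual code.

```python
import asyncio


async def fetch_with_retry(fetch, url, retries=3, backoff=0.5):
    """Call fetch(url), retrying on failure with exponential backoff."""
    for attempt in range(retries):
        try:
            return await fetch(url)
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the error
            await asyncio.sleep(backoff * 2 ** attempt)


async def scrape_all(fetch, urls, concurrency=20):
    """Scrape every URL with at most `concurrency` requests in flight.

    Results come back in the same order as `urls` (asyncio.gather
    preserves argument order).
    """
    semaphore = asyncio.Semaphore(concurrency)

    async def one(url):
        async with semaphore:
            return await fetch_with_retry(fetch, url)

    return await asyncio.gather(*(one(u) for u in urls))
```

For the memory side, the usual fix is to stream results to disk in batches inside `one()` instead of accumulating everything in the gathered list; that detail is left out here to keep the concurrency pattern visible.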
We are hiring remote contributors to create photo-based language data using everyday materials found around you. This project focuses on collecting natural, real-life text captured through a phone camera. What You’ll Do - Photograph common objects that contain written text (printed or handwritten). - Provide three unique shots per item, changing position, distance, or lighting. - Ensure content is original and varied. - Most of the visible text (minimum 75%) must be in your local language. Eligibility - Fluent in the target language (native or near-native). - Physically located in a country where the language is used. - Own a smartphone capable of taking clear photos. How It Works - Upload images through a Google Form. - Submissions are reviewed individually. - Only valid, clear, ...
Learn how to hire and collaborate with a freelance Typeform Specialist to create impactful forms for your business.
A complete guide to finding, hiring, and working with a skilled freelance typist for your typing projects.