Explanation:
First, please take a look at [login to view URL]; it's a URL-shortening service.
[login to view URL] offers stats when you add a ~ sign to the end of each URL.
For example: [login to view URL]~
I would like a software that does the following:
1) Extract all of the [login to view URL] links from Google.
Note: The best way is to use "[login to view URL]" in a Google search query
I should be able to edit the text file that includes the search query.
For example, I could add a different search engine (Bing, Yahoo) to the .txt file for a custom search.
Example: "[login to view URL]" "[login to view URL]"
Example: "[login to view URL]" "[login to view URL]"
(this would give me only Instagram [login to view URL]s)
- Basically, I want to harvest all [login to view URL] links.
2) A GUI option in the software to check stats (by appending a ~ and loading the [login to view URL] link), the destination each URL points to, and where the traffic is coming from (UK, USA, etc.).
3) Option to sort the Excel/TXT output by click count (highest first, taken from the stats) and to erase duplicates.
4) The output should be Excel (XLS) or TXT. If TXT, it should be one URL per line.
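To make the requirements concrete, the four steps could be sketched roughly as below. This is a minimal sketch, not the final software: `short.example` stands in for the redacted service's domain, and the assumption that a click count can be read from the stats page is hypothetical, since the real stats format isn't shown here.

```python
import re

# Hypothetical short-link domain standing in for the redacted service.
SHORT_LINK_RE = re.compile(r"https?://short\.example/\w+")

def extract_links(html):
    """Step 1: pull every short link out of a search-results page's HTML."""
    return SHORT_LINK_RE.findall(html)

def stats_url(link):
    """Step 2: the service exposes stats when a ~ is appended to the link."""
    return link + "~"

def dedupe_and_sort(records):
    """Step 3: records is a list of (url, clicks) pairs scraped from the
    stats pages. Erase duplicate URLs (keeping the highest click count
    seen) and sort highest clicks first."""
    best = {}
    for url, clicks in records:
        if url not in best or clicks > best[url]:
            best[url] = clicks
    return sorted(best.items(), key=lambda pair: pair[1], reverse=True)

def write_txt(rows, path):
    """Step 4: TXT output, one URL per line, already in sorted order."""
    with open(path, "w") as f:
        for url, clicks in rows:
            f.write(url + "\n")
```

Fetching the search-results pages themselves (and reading the configurable query from the .txt file) would sit in front of `extract_links`; an XLS writer could replace `write_txt` for the Excel option.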
Hello,
I can help you with your project, Scraping - Collecting Stats on ShortenURL Links.
I have more than 5 years of experience in JavaScript, PHP, Python, software architecture, and web scraping, and I have worked on several similar projects before!
I have completed 350+ projects; please check my profile reviews. I can deliver your job within your deadline. Please ping me for more discussion.
I can assure you of 100% job satisfaction.
Thanks,
Hello,
I have read and understood what has to be done in this task. I have sound experience with this kind of work, having completed many similar projects in the past.
I can assist you in getting this task done with a focus on quality and on-time delivery. I have my own software for scraping data.
Regards,
Nidhi
Hi!
I am a developer proficient in Python, PHP, and Ruby, and I have experience writing web scrapers. I have worked on projects similar to this one: scraping job sites, crawling the web to collect email addresses, etc.
Please contact me with more info about the project. Thank you!