UBot Underground

Creating an online drip feed service



I was thinking about creating an online service for links (similar to drip feed blasts). What all would be required to create this, or is this something that uBot simply doesn't have the power to do and I should outsource it?


You could do this very easily with uBot.

Here's how I might do it:

1. Store my data (URLs*) inside a text file on my VPS C: drive.
2. Set up a scheduled task to run every hour between 9am and 5pm.
3. When the software runs at 9am, it grabs the list of URLs from the text file.
4. It takes the first URL in the list and posts it to a Facebook account and/or a Twitter account, or wherever you want.
5. Then remove that URL from the list.
6. The next hour it runs the same thing again (see the sketch below).

*Replace URLs with whatever data you want.
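uBot has its own file and list commands for these steps, but as a rough language-neutral illustration, here's the same grab/post/remove cycle sketched in PHP; the file path and the posting call are placeholders, not anything uBot-specific:

```php
<?php
// Hourly task: read the list, take the first URL, post it, save the rest.
$file = 'C:\\data\\urls.txt'; // the text file on the VPS C: drive (assumed path)
$urls = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

if (empty($urls)) {
    exit; // nothing left to drip out
}

$next = array_shift($urls); // take the first URL in the list
// postToFacebook($next);   // hypothetical posting step goes here

file_put_contents($file, implode(PHP_EOL, $urls)); // remove it from the list
```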


It is doable, you just need database storage to track everything easily!

You can store everything on your MySQL server and have both the web interface and your bot communicate with the same database! (A minimal sketch follows.)
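To make that concrete, here is a minimal sketch of the shared-table idea in PHP with PDO; the database name, credentials, table, and columns are all illustrative assumptions:

```php
<?php
// Both the web interface and the bot talk to this one MySQL database.
$pdo = new PDO('mysql:host=localhost;dbname=dripfeed', 'user', 'pass');

// Run once: the table both sides share (all names are assumptions).
$pdo->exec('CREATE TABLE IF NOT EXISTS jobs (
    id     INT AUTO_INCREMENT PRIMARY KEY,
    url    VARCHAR(255) NOT NULL,
    status ENUM("pending", "in_progress", "complete") DEFAULT "pending"
)');

// Web-interface side: queue a URL for the bot to pick up.
$pdo->prepare('INSERT INTO jobs (url) VALUES (?)')
    ->execute(['http://example.com/page-to-drip']);
```

The bot then polls the same jobs table on its own schedule, so neither side ever has to talk to the other directly.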



Thanks Aymen, any idea how I could communicate with the MySQL database, especially once it is on my VPS? Tell me, since I see that you have created many bots over time, do you think this is something that seems really feasible? As in, is it something that you would create? The way I see it, there is quite a lot that needs to be done: registration on forums, clicking the verification emails (basically stuff that xrumer would do), and similarly for social bookmarking websites. Then if I want blog commenting, that needs to be incorporated as well.

 

 

 

Hey Kev, thanks for the reply. This seems easy, but same question to you as I asked above: do you think it's feasible, or is there some way to shorten the time for creating accounts, email verification, etc.? I think what I'm asking is whether I would need to run Scrapebox alongside this, or buy some other bots from someone else, so that I wouldn't need to do everything myself.

2 weeks later...

It's possible; you just need to store the jobs in a database and then have the bot(s) pick up the jobs and do them. For example, if you have 50 clients and they each need 1k links per day, then every day at a certain time a PHP script runs via a cron job, processes all the current customers, and adds their jobs into a table in a database. Shortly after that, the bots wake up and start to look for jobs in the database. The bots pick up jobs by changing the status from not complete to in progress in the database; when they're done, they can remove that row or change it to complete. (A sketch of that pickup step follows.)
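Here is a rough sketch of that pickup step, reusing the illustrative jobs table from earlier; the schema and connection details are assumptions, not anything uBot-specific:

```php
<?php
// One bot atomically claims a pending job, works it, then marks it complete.
$pdo = new PDO('mysql:host=localhost;dbname=dripfeed', 'user', 'pass');

$pdo->beginTransaction();
$job = $pdo->query('SELECT id, url FROM jobs
                    WHERE status = "pending"
                    ORDER BY id LIMIT 1 FOR UPDATE')
           ->fetch(PDO::FETCH_ASSOC);
if ($job) {
    // Flip the status so no other bot grabs the same row.
    $pdo->prepare('UPDATE jobs SET status = "in_progress" WHERE id = ?')
        ->execute([$job['id']]);
}
$pdo->commit();

if ($job) {
    // ... do the actual posting work for $job['url'] here, then:
    $pdo->prepare('UPDATE jobs SET status = "complete" WHERE id = ?')
        ->execute([$job['id']]);
}
```

With InnoDB, the SELECT ... FOR UPDATE inside the transaction locks the row, which is what stops two bots from claiming the same job.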

 

You will probably need to run each bot on a separate VPS if you are using more than one. Ideally you would be pretty familiar with working with databases and know PHP, because you need the web interface side of things, where people can add in URLs and keywords and schedule postings.


I'm trying something like this, but different: I have to scrape some data from one website and post it to another, for multiple users, so I decided to do all the backend in Rails. Even the page scraper will run in Ruby; it will generate the orders for the bot.

This way I get a lightweight bot: it doesn't need to do the scraping part, it only looks at a web interface, gets the orders, and runs them.

And yes, you have to know a scripting language like PHP, Rails, etc., and a DB like MySQL, for example (or SQLite).

So I put some general orders in the back-end, it scrapes the values and puts them in the DB, then the bot runs and gets the tasks. Very simple but effective. (A sketch of the bot's polling loop follows.)
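The poster's back-end is Rails, but the bot side is independent of that; as a rough sketch (in PHP, to match the earlier examples), the polling loop could look like this, where the endpoint URL and the JSON shape of an order are pure assumptions:

```php
<?php
// Lightweight bot: poll the back-end for queued orders and run each one.
$raw    = file_get_contents('http://backend.example.com/orders.json'); // assumed endpoint
$orders = json_decode($raw, true) ?: [];

foreach ($orders as $order) {
    // An order might carry a URL plus whatever action the bot should take.
    // runOrder($order); // hypothetical hand-off to the bot's own logic
    echo "Would run order for {$order['url']}\n";
}
```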

