UBot Underground

Really Simple But Cool Trick



I just discovered, through long, arduous attempts to solve a relatively complex problem, that the solution was a retreat into simplicity. There are NUMEROUS scenarios you can apply this to. Here was mine:

 

I was trying to navigate through a site's pages using the "next" link of the nav bar. I knew I would find what I was looking for; the trick was that I didn't know where (on what page) I would find it.

 

There was no single unique identifier to tell you what page you were on. It was going to require a lot of scraping and scrubbing of scraped data. The problem was two-fold: I needed to locate the link, and IF it met a certain condition (the page number was key to this condition), the bot would perform a certain action.

 

After many failed scrapes, if/thens, and evaluates, it suddenly hit me: as soon as you have navigated to what would be considered page 1 of the nav structure, you start a list (in my case I simply added $url). Every time I hit next, I added $url to the list... you see?

 

This, via $list total, was keeping my EXACT page count (if there are 4 entries in the list, I am on page 4!).

 

In this case I could not perform an action unless the item I was looking for was past page 6, so I set the variable #threshold to 6.

 

When it found what it was looking for, it simply evaluated the $list total against the threshold to make sure the $list total was greater. In simpler terms:

 

If-----search page(my item) Then---If---Evaluate $list total > #threshold---Then---Do action

 

BTW, solving my issue this way took exactly 3 small nodes of scripting, where I had about 12 using other, more common methods.
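
For anyone who wants to see the idea outside of uBot, here's a rough sketch of the same logic in Python with Selenium. This is NOT my actual script; the start URL, the "Next" link text, and the found_what_i_need() check are just placeholders you would swap for whatever your site uses:

# Keep a list of visited URLs; len(visited_urls) is always the current page number.
from selenium import webdriver
from selenium.webdriver.common.by import By

THRESHOLD = 6  # only act if the item turns up past page 6

def found_what_i_need(driver):
    # Placeholder: whatever check identifies your item on the current page.
    return "my item" in driver.page_source

driver = webdriver.Chrome()
driver.get("https://example.com/listing")  # placeholder start URL (page 1)

visited_urls = [driver.current_url]  # page 1 -> one entry in the list

while True:
    if found_what_i_need(driver):
        # len(visited_urls) is the exact page number we are on.
        if len(visited_urls) > THRESHOLD:
            print(f"Found it on page {len(visited_urls)} - do the action here")
        break

    next_links = driver.find_elements(By.LINK_TEXT, "Next")  # placeholder "Next" link
    if not next_links:
        break  # no more pages
    next_links[0].click()
    visited_urls.append(driver.current_url)  # one more page = one more entry

driver.quit()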

 

Another good use I can think of: many of you create scrapers, friend adders, etc. With this simple technique you can have the bot check as many pages as you want (or, if the site has a limit, set it not to exceed a certain number).
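
Here's a quick sketch of that variation, again in Python/Selenium with placeholder values; MAX_PAGES is just whatever cap you or the site decide on:

# Same trick used as a page cap: stop paging once the list reaches a limit.
from selenium import webdriver
from selenium.webdriver.common.by import By

MAX_PAGES = 25  # placeholder: the site's limit, or your own

driver = webdriver.Chrome()
driver.get("https://example.com/listing")  # placeholder start URL (page 1)

visited_urls = [driver.current_url]

while len(visited_urls) < MAX_PAGES:
    # ... scrape, add friends, etc. on the current page ...
    next_links = driver.find_elements(By.LINK_TEXT, "Next")  # placeholder "Next" link
    if not next_links:
        break  # site ran out of pages before the cap
    next_links[0].click()
    visited_urls.append(driver.current_url)

print(f"Checked {len(visited_urls)} pages")
driver.quit()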

 

I couldn't believe how simple this was, and if you were able to follow my gibberish explanation you will understand how this can replace some pretty complex code.

 

I hope someone will find this helpful.

 

John


Hey John

It sure is a very cool trick, and I have been using it for a little while now scraping data from some bastard sites; it's simple and quite effective at times.

 

Praney :)

 


Yeah, it gave me one of those "slap myself in the head" moments for not thinking of it sooner. We tend to think if the problem looks complex, then it must need a complex solution!



Hey John

 

That is very true, but now I take a break for 5 minutes if I get stuck somewhere. It's just a stress reliever, and it helps me think smarter rather than harder ;)

 

We all get stuck at some point or other.

 

Cheers bro!

 

Praney :)

