All off-topic discussions go here. Everything from the funny thing your cat did to your favorite TV shows. Non-programming computer questions are OK too.
How many of us have spent hours at http://thinkgeek.com/fortune.shtml ? (If you don't know what it is, spend the rest of the day there...)
Well, after a couple of hours of refreshing, you see, I got smart. I have a program running right now that requests a new copy every second and parses the quote out. Then it formats it all nicely and saves it to a file for easy viewing.
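For anyone curious, here's a minimal sketch of that loop in Python. The <blockquote> pattern, the fortunes.txt filename, and the separator are assumptions for illustration; the real page's markup would need its own regex.

# Minimal sketch of the polling loop described above (assumptions: the quote
# sits inside a <blockquote> element, output goes to a hypothetical fortunes.txt).
import re
import time
import urllib.request

URL = "http://thinkgeek.com/fortune.shtml"
SEPARATOR = "... ... ...\n"   # same separator mentioned later in the thread

def fetch_quote():
    with urllib.request.urlopen(URL, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Assumed markup -- the real page would need its own pattern.
    match = re.search(r"<blockquote>(.*?)</blockquote>", html, re.DOTALL)
    return match.group(1).strip() if match else None

def main():
    seen = set()  # skip repeated quotes
    with open("fortunes.txt", "a", encoding="utf-8") as out:
        while True:
            quote = fetch_quote()
            if quote and quote not in seen:
                seen.add(quote)
                out.write(quote + "\n" + SEPARATOR)
                out.flush()
            time.sleep(1)  # one request per second, as described

if __name__ == "__main__":
    main()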
I'm sad. I've written a program to continuously pull down and extract the posts here into a gigantic file which I can read at my leisure. I'm adding a feature now to sort out the repeated posts... URL to follow... find it if you can! (Fiendish Grin)
Update: I'm not rewriting it to post to the site too, as that might encourage a lot of same-message spam. (in reply to someone's request)
Update to the Update: Now it has a cool setting to stop you after reading more than a specified number of posts (defaults to 5000). It also has a feature that runs the quotes through your start bar at a speed you can specify...
Found this halfway down my 4th download file. Each download file is 350K, quotes only, separated by "... ... ...". I have not skipped a single quote.
I need a life.
*edit*
This sucks. I can't find my quote, but I can find one that says exactly the same thing......
You would connect to "ThinkGeek.com" on port 80. Then you would send the following command through the socket:
"GET /fortune.shtml HTTP/1.0\n\n"
Then, taking into account (of course) that the incoming data arrives split across an average of 4 packets (TCP is a streaming protocol), you gather up the web page's source. Then you search it for the quote, save the quote to a file, and close the socket. Repeat.
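A rough Python sketch of that socket-level version, under the same assumptions (host thinkgeek.com, path /fortune.shtml), leaving the quote parsing to the earlier snippet:

import socket

HOST = "thinkgeek.com"
PATH = "/fortune.shtml"

def fetch_page():
    with socket.create_connection((HOST, 80), timeout=10) as sock:
        # HTTP/1.0 request; a blank line (\r\n\r\n) ends the headers.
        request = f"GET {PATH} HTTP/1.0\r\nHost: {HOST}\r\n\r\n"
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)  # the response arrives split across packets
            if not data:            # HTTP/1.0 server closes the connection when done
                break
            chunks.append(data)
    raw = b"".join(chunks)
    headers, _, body = raw.partition(b"\r\n\r\n")
    return body.decode("utf-8", errors="replace")

Since it's HTTP/1.0 with no keep-alive, the server closing the connection is what ends the read loop, so there's no need to parse Content-Length before grabbing the body.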