Good work Debian
Feb 27, 2004
6:02 PM EST
The sooner their kernels are updated the better. I just started managing a research department that has standardized on Linux on the desktop.
Dave (if you're listening...) - I listened to your interview at TheLinuxShow, and you said that you're aggregating RSS and HTML screen-scraped feeds. How did you do this one? Email scraping?
BTW - I like your site. I'm a fan of Linux and PHP. If you're interested, check out my article at phparch.com. It appeared in the May 2003 edition, which is a free download right now. I'm curious about the code behind LXer...
Feb 28, 2004
4:01 AM EST
Yes, I got this from "email scraping". I'm subscribed to the announce/alert mailing lists for most of the distributions, and when the mail comes in, procmail automatically sends it to my LXer robot, which then makes a decision, checks off the categories, and inserts it into the unposted queue.
Then I edit it to add a lead paragraph, and then approve it. Takes about 15 seconds of my time, total! :)
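The procmail step described above might look roughly like this (a hypothetical ~/.procmailrc recipe; the list headers and the robot's path are illustrative assumptions, not LXer's actual setup):

```procmail
# Hypothetical recipe: any message from a distro announce/alert list
# gets piped to the robot, which categorizes it and adds it to the
# unposted queue. List names and script path are placeholders.
:0
* ^List-Id:.*(debian-security-announce|fedora-announce|suse-security-announce)
| /usr/local/bin/lxer-robot
```

The `:0` opens a recipe, the `*` line is a regex condition against the message headers, and the `|` action pipes the full message to the named program on stdin.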
If you want to talk about the code behind LXer, go here: http://lxer.com/module/newswire/view/1/index.html
Post any questions to that story and we'll discuss them there. :)