Closed Bug 220373 Opened 21 years ago Closed 20 years ago

web-sniffer won't compile on RedHat Linux

Categories

(Webtools Graveyard :: Web Sniffer, defect)

Hardware: x86
OS: Linux
Type: defect
Priority: Not set
Severity: normal

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: justdave, Assigned: erik)

References

Details

Attachments

(1 file)

It either won't compile on Linux or just won't compile...  It's been there
forever on the old Solaris box, so I assume it compiled at some point, at least
on Solaris. :)

In file included from hash.c:28:
mutex.h:25:20: thread.h: No such file or directory
mutex.h:26:19: synch.h: No such file or directory
In file included from mutex.h:29,
                 from hash.c:28:
main.h:28:19: synch.h: No such file or directory
make: *** [hash.o] Error 1
Blocks: 220351
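
For context, thread.h and synch.h are the Solaris UI-threads headers; they do
not exist on Linux, so no -devel package will supply them. Below is a minimal
sketch of how a header like mutex.h could guard the includes and fall back to
POSIX threads on Linux. The sniff_mutex_t typedef is hypothetical and the real
mutex.h in the source looks different; this is only an illustration.

    /* Illustrative sketch only, not the real mutex.h from Web Sniffer. */
    #ifndef MUTEX_H
    #define MUTEX_H

    #ifdef __sun
      /* Solaris UI threads: the headers the original code expects */
      #include <thread.h>
      #include <synch.h>
      typedef mutex_t sniff_mutex_t;        /* hypothetical wrapper type */
    #else
      /* Linux and other POSIX systems */
      #include <pthread.h>
      typedef pthread_mutex_t sniff_mutex_t;
    #endif

    #endif /* MUTEX_H */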
It's quite possible we're missing a needed -devel package; anyone know which one?
From a Debian package search I got this (OK, this is not Red Hat, but it gives
some hints that might be useful here, because Debian has packages for nearly
everything). For synch.h the match is the package libace5.2-dev. That package
also includes thread.h, so it seems to be the right one. Go look for libace-dev
for Red Hat :)!
OK, at http://deuce.doc.wustl.edu/Download.html you can download a source
version of ACE, and http://cvs.doc.wustl.edu/ace-cvs.cgi/ACE_wrappers/ace/
seems to contain the two missing files. So ACE+TAO should be the right file to
download (there probably isn't a Red Hat RPM, is there?).
OK, I think this is wrong; note the upper/lowercase difference in the file
names. You might try googling for these files.
I just looked into the code and found no real reason why this mutex stuff
should be necessary. It is used in net.c and robot.c. In net.c it is needed
because the variables it uses are static. But why are they static?
Seeing as how UNIX != Linux (despite what SCO wants to think ;) it's also quite
possible that some of the APIs being used are defined in different header files
on Linux than they are on UNIX/Solaris.
The Solaris and Linux thread APIs have different names as well and may
behave differently. Fixing that would be a lot of work.

Is there a good reason why this thing is threaded? Its purpose is
essentially serialized anyway.
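
For anyone taking the porting route rather than removing the threading, here is
a sketch of thin wrappers over the two APIs. The sniff_mutex_lock/unlock names
and the guarded mutex.h they build on are hypothetical, continuing the sketch
above, not part of the actual source; the underlying calls (Solaris mutex_lock
and mutex_unlock versus POSIX pthread_mutex_lock and pthread_mutex_unlock) are
the real ones that differ between the platforms.

    /* Illustrative wrappers over the differing thread APIs. */
    #include "mutex.h"   /* the hypothetical guarded header sketched earlier */

    static int sniff_mutex_lock(sniff_mutex_t *m)
    {
    #ifdef __sun
        return mutex_lock(m);          /* Solaris UI threads */
    #else
        return pthread_mutex_lock(m);  /* POSIX threads */
    #endif
    }

    static int sniff_mutex_unlock(sniff_mutex_t *m)
    {
    #ifdef __sun
        return mutex_unlock(m);
    #else
        return pthread_mutex_unlock(m);
    #endif
    }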
OK, I figured I'd waste a few hours. I chopped out anything that I didn't like,
and I've made it compile and work. It's a hack, an atrocity, an abomination,
but it works for me. It is highly likely that I've broken other things, reduced
performance so it will hammer the server, or introduced some awful security
hole.

You probably don't want to run this anyway. It doesn't use CSS so it produces
ugly output.
Attached patch "patch" (Splinter Review)
Wouldn't know a thread if one bit it. Won't work on Solaris. Won't work on
Windows. Probably won't work anywhere.
BTW: if someone needs quite a good replacement, try http://web-sniffer.net/
Another possibility would be to replace this with a script and use
source-highlight <http://www.gnu.org/software/src-highlite/> as a back 
end.
Easiest to redirect to web-sniffer.net IMO. It works well. justdave: can you
arrange that?

Gerv
Done.
Assignee: nobody → erik
I realize that mozilla.org's web sniffer location now redirects to another HTTP
viewer, and that's fine, but I have ported Web Sniffer to Linux, and I would like
to check in the changes. The "diff -u" output is about 2500 lines and the new
files are about 750 lines (excluding autoconf output). Should I attach the diffs
and new files to this bug for review, or should I just check it all in, given
that I am the original author of Web Sniffer?

http://webtools.mozilla.org/web-sniffer/
Status: NEW → ASSIGNED
By the way, I have registered a new domain to host my own Web Sniffer:

http://websniffer.org/
(In reply to comment #5)
> I just looked into the code and found no real reason why this mutex stuff
> should be necessary. It is used in net.c and robot.c. In net.c it is needed
> because the variables it uses are static. But why are they static?

Well, most of the tools in the Web Sniffer source directory are single-threaded,
but the "robot" tool (aka web crawler, or spider) gets much better performance
when it is threaded. All the tools happen to use the same routines, and the
robot collects some statistics and measures the time it takes for a DNS lookup,
etc. In order to count various things, the multi-threaded robot must use static
variables. (Per-thread storage is another possibility, but why bother?)

I suppose I could place various hooks in the right places to call back to the
app, which can then synchronize threads if any. I'll think about it.
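
To illustrate the pattern described above, here is a minimal sketch against
POSIX threads: counters shared by all robot threads, kept in static variables
and protected by a mutex. The record_dns_lookup function and the variable
names are made up for this example and do not come from net.c.

    #include <pthread.h>

    /* Counters shared by every robot thread, so they are static and
     * guarded by one mutex; per-thread storage would be the alternative. */
    static pthread_mutex_t stats_lock  = PTHREAD_MUTEX_INITIALIZER;
    static unsigned long   dns_lookups = 0;    /* total lookups, all threads */
    static double          dns_seconds = 0.0;  /* total DNS time, all threads */

    void record_dns_lookup(double seconds)
    {
        pthread_mutex_lock(&stats_lock);
        dns_lookups++;
        dns_seconds += seconds;
        pthread_mutex_unlock(&stats_lock);
    }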
(In reply to comment #14)
> I realize that mozilla.org's web sniffer location now redirects to another
> HTTP viewer, and that's fine

(In reply to comment #15)
> By the way, I have registered a new domain to host my own Web Sniffer:
> 
> http://websniffer.org/

Can the redirect be changed to point to http://websniffer.org/ pending
resolution of this bug? The original mozilla web sniffer is much more useful
than the current redirectee (at least for anything that I ever use it for ;-) )
After this bug, I still have to fix bug 57556 before it can be deployed
anywhere.
Changed name from Web Sniffer to SniffURI. See sniffuri.org.

Ported to Linux. See Bonsai for file version details and log message.
Status: ASSIGNED → RESOLVED
Closed: 20 years ago
Resolution: --- → FIXED
I agree with Dave. I missed this tool -- we should get this deployed again at
mozilla.org. Thanks, Erik.
(In reply to comment #8)
> You probably don't want to run this anyway. It doesn't use CSS so it produces
> ugly output.

I've checked in changes to make it use CSS. See sniffuri.org.
QA Contact: mattyt-bugzilla → web.sniffer
Product: Webtools → Webtools Graveyard