26 Jun, 2009, Hades_Kane wrote in the 21st comment:
Votes: 0
Davion said:
-You cannot opt out of being crawled by our crawler unless you remove your listing
-Eventually the crawler will support a delay feature
-At the current time, the crawler goes every half hour.


I imagine it shouldn't be a big deal to just code in a check specifically for that IP not to log. Unfortunately, I won't have time to do this until after I get back from my vacation…

Also, is there an article or anything official on the site that deals with implementing Mud Bytes-specific MSSP? Or is everything still enough in the beta/planning stages that it's not much of a concern?


edited to add:
In fact, if there ends up being an official article on implementing MSSP for Mud Bytes, I think it would be nice to have included in there codebase-specific ways to prevent that connection from logging. I'd be happy to contribute to that for ROM with whatever method I decide to do that :)
26 Jun, 2009, Davion wrote in the 22nd comment:
Votes: 0
http://www.mudbytes.net/index.php?a=arti... is how we have it implemented currently.
26 Jun, 2009, Davion wrote in the 23rd comment:
Votes: 0
HK said:
I think it would be nice to have included in there codebase-specific ways to prevent that connection from logging. I'd be happy to contribute to that for ROM with whatever method I decide to do that :)


I most certainly will not! The only reason why is that it makes your logs inconsistent. Say your MUD listing gets removed because our crawler claims it isn't connecting to you (some bug or something). You get pissed, and we say "not our fault!" and you lack the logs to show us we've made an error. This is just one example.

I don't get why this log thing is a problem. Logs are -supposed- to show everything and be verbose. For example, at this moment in time a MUD that will remain nameless is trying to connect to IMC. It fails every 50 seconds and tries to reconnect… all part of running several public servers. You should see our apache logs ;).
26 Jun, 2009, Hades_Kane wrote in the 24th comment:
Votes: 0
Davion said:
HK said:
I think it would be nice to have included in there codebase-specific ways to prevent that connection from logging. I'd be happy to contribute to that for ROM with whatever method I decide to do that :)


I most certainly will not! The only reason why is that it makes your logs inconsistent. Say your MUD listing gets removed because our crawler claims it isn't connecting to you (some bug or something). You get pissed, and we say "not our fault!" and you lack the logs to show us we've made an error. This is just one example.


Fair enough :p

Quote
I don't get why this log thing is a problem. Logs are -supposed- to show everything and be verbose. For example, at this moment in time a MUD that will remain nameless is trying to connect to IMC. It fails every 50 seconds and tries to reconnect… all part of running several public servers. You should see our apache logs ;).


Once there is more going on in my game, it won't bother me so much. It's mostly just clutter right now that is obscuring any relevant information I might be trying to get, and considering the (especially lately) low level of activity we normally see, there are much larger blocks of it trying to connect than you might suspect :p To put it into context, over the last 7 hours, the crawler accounts for literally 2/3 of my logfile. Once there is considerably more activity, the crawler would just amount to background noise and I don't think it would even be an annoyance. I also haven't happened to be online during the times the crawler has hit, but all incoming connections display directly to top-level immortals, and if I were one of those "sit afk for hours" types like some others, that would also account for a lot of unnecessary screen spam. There are certainly solutions I can implement to avoid this, which is what I plan on doing. The most logical I can see at this point, for me anyway (assuming I can figure this out properly), is to put a check in for that specific IP for it not to log or display to the MUD when it connects.
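For the curious, the check described above might look something like this in a ROM-style codebase. This is only a sketch: the crawler address and the function name are made up, and the real call site would be the accept loop in new_descriptor(), before the log_string()/wiznet() calls fire.

```c
#include <stdbool.h>
#include <string.h>

/* Hypothetical crawler address -- substitute the listing site's real one. */
#define CRAWLER_IP "192.0.2.10"

/* Returns true when the connecting host should be skipped by the
 * "new connection" log and immortal-channel spam.  Intended to be
 * called from the accept loop in a ROM-style new_descriptor(). */
bool suppress_connection_log(const char *host_ip)
{
    return host_ip != NULL && strcmp(host_ip, CRAWLER_IP) == 0;
}
```

The gating condition stays in one place, so adding a second crawler address later means touching a single function rather than every log call.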


edit to add:
And I don't think my situation is really that uncommon or unique, so I can imagine some other people possibly wanting to do the same, which is the reason I was suggesting giving them a helping hand :) But I understand your reasons for not wanting to ;p
26 Jun, 2009, David Haley wrote in the 25th comment:
Votes: 0
If it bugs you so much to see the crawler, and you think it's only a temporary problem for you, then why can't you fix your logs? It sounds kinda like you're asking the protocol to do something to work around a temporary issue you have (i.e. being in development and only wanting to see very specific log entries).
26 Jun, 2009, Hades_Kane wrote in the 26th comment:
Votes: 0
David Haley said:
If it bugs you so much to see the crawler, and you think it's only a temporary problem for you, then why can't you fix your logs? It sounds kinda like you're asking the protocol to do something to work around a temporary issue you have (i.e. being in development and only wanting to see very specific log entries).


To be fair, that's not what I'm asking.

I said:
"I imagine it shouldn't be a big deal to just code in a check specifically for that IP not to log. Unfortunately, I won't have time to do this until after I get back from my vacation…"

and:
"There are certainly solutions I can implement to avoid this, which is what I plan on doing. The most logical I can see at this point, for me anyway (assuming I can figure this out properly), is to put a check in for that specific IP for it not to log or display to the MUD when it connects."

You aren't guilty of skimming after chiding me for it, are you? :wink:
26 Jun, 2009, Davion wrote in the 27th comment:
Votes: 0
I don't want to split the thread again (as I'm on vacation too, atm!) so let's just get back to talking about the MSSP crawler. Thanks!
26 Jun, 2009, Cratylus wrote in the 28th comment:
Votes: 0
stinky fist
26 Jun, 2009, David Haley wrote in the 29th comment:
Votes: 0
No, I agree with Davion that ignoring the log entry should definitely not be a standard solution. That's why I pointed at your particular case because, well, you said yourself that you had a somewhat particular case, and that this is not a general problem. Perhaps you are not asking for it for your particular case, but you certainly seem to have been advocating that this be the general approach taken by MB: you are (or have been) very highly against the suggestion that people fix their logs.

In other words, because it relates to a particular case, it should have a particular solution, not something that people are encouraged to plug in as a standard feature.
26 Jun, 2009, Cratylus wrote in the 30th comment:
Votes: 0
At the risk of derailing a bit and being forced to smell the glove, I have
to ask. Is it really common for muds to log the establishment of a
connection to the login port?

I ask because what my codebase does along these lines is log when
someone's login attempt fails or succeeds. In other words, someone
just connecting to the port and disconnecting is not seen by me, because
it's frankly not of interest to me. For the very reason discussed…the only information
I expect to see from someone not-even-trying-to-log-in is random interbutt spam.

I guess what I'm saying is…aside from Davion's fanciful imagining of anyone
ever having to make any effort to prove him wrong, what advantage is there
to logging things going bump in the night against your login port?

-Crat
http://lpmuds.net
26 Jun, 2009, Davion wrote in the 31st comment:
Votes: 0
All Diku MUDs log the initial connection (AFAIK, anyway). It's probably the only time someone sees the IP before gethostbyaddr is called and gives it some other value.
26 Jun, 2009, Guest wrote in the 32nd comment:
Votes: 0
Cratylus said:
At the risk of derailing a bit and being forced to smell the glove, I have
to ask. Is it really common for muds to log the establishment of a
connection to the login port?


Sadly, it seems to be more or less the default behavior of all Dikurivatives to log the initial connection even when no further input is passed to it. I got tired of seeing this sort of thing on Alsherok, largely due to listing sites who validated the game's uptime, but also due to people who left autoconnecting clients up and walked away from the computer. Then something would happen - we'd crash, or a reboot or something comes along - and they'd be there, connecting like rabid fiends, once every 5 minutes as the clients would time out. Never trying to log on. So I coded around it and only logged connections from folks who actually went to the trouble of talking to the link. Log spam dropped by 90%. It's now standard fare in AFKMud; you'd have to code around it to get the spam BACK.
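A sketch of the deferred logging described above. The struct is heavily simplified and the field and function names are invented; the idea is simply to announce a socket from the input handler rather than from accept(), so silent probes never reach the log at all.

```c
#include <stdbool.h>
#include <stdio.h>

/* Simplified descriptor: a real Diku descriptor_data carries far more state. */
struct descriptor_data {
    char host[64];  /* remote address as text */
    bool announced; /* have we already logged this connection? */
};

/* Call this from the input handler, not the accept loop, so a socket
 * is only announced the first time it actually sends data.  Returns
 * true when the announcement was made on this call. */
bool announce_on_first_input(struct descriptor_data *d)
{
    if (d->announced)
        return false;
    d->announced = true;
    printf("Sock.sinaddr: %s\n", d->host);
    return true;
}
```

Crawlers and parked clients that connect without sending anything never trip the flag, which is the 90% log-spam reduction described.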
26 Jun, 2009, Hades_Kane wrote in the 33rd comment:
Votes: 0
David Haley said:
you are (or have been) very highly against the suggestion that people fix their logs.


You misunderstand, and at the risk of rehashing this again… I was against people continually suggesting "fix your MUD" solutions to Mabus when he made it clear that those weren't the kinds of answers he was looking for, and that he was simply asking for a way to opt out without having to modify his code.

I could go through and quote numerous instances of me actually agreeing that the solutions presented are in fact wise and a better route, if necessary, but I'm not trying to derail the thread a second time ;p

I was also merely suggesting, as a potential addition to the article on MSSP, that if people don't want their logs to pick up on the crawler, it might save them some work if some examples were provided. But again, I said "fair enough" to Davion in regards to his objections to it, and have dropped that since.

I'm looking to apply a particular solution to my particular problem :p
26 Jun, 2009, David Haley wrote in the 34th comment:
Votes: 0
HK, I'm not sure if you're actually asking me a question or what you're expecting me to say, so I'll leave it there for now.
Cratylus said:
I guess what I'm saying is…aside from Davion's fanciful imagining of anyone
ever having to make any effort to prove him wrong, what advantage is there
to logging things going bump in the night against your login port?

I think it's there only because it's always been there, and people have gotten used to seeing it. :shrug: Not trying to be facetious.

I have trouble imagining why it would be useful unless you're trying to debug something fairly specific. (e.g., are people even able to ping my port, before failing to log on)
26 Jun, 2009, Davion wrote in the 35th comment:
Votes: 0
Well, like I said, it's the only instance where the IP is visible before gethostbyaddr changes it. Granted, you can just store the information for later, but no one does. So if you actually want the IP of someone, you'll have to keep it in or set up some other form of logging them.
26 Jun, 2009, David Haley wrote in the 36th comment:
Votes: 0
Well, sure. But that sounds like an extraordinarily easy thing to fix, ne? :wink:
26 Jun, 2009, Hades_Kane wrote in the 37th comment:
Votes: 0
David Haley said:
HK, I'm not sure if you're actually asking me a question or what you're expecting me to say, so I'll leave it there for now.


I can't help but feel that words were being put in my mouth that never came out of it, due to a likely misinterpretation of my stance in that other thread. Particularly in a medium like text, where so much can be lost and misinterpreted, I think it necessary at times to be as clear as possible and take any opportunity to clear up any misunderstanding.

I don't want anyone thinking that I was advocating the position that "fix your MUD" solutions were bad. It was simply a matter of Mabus not looking for that, and in my opinion everyone trying to cram that down his throat while ignoring his initial question/request along with subsequent "that's not what I'm looking for" statements.

I hope this clears up any confusion on the matter, which is my -only- agenda in continuing to post about it.


On the IP logging, I find it useful to be able to see the initial connection attempts for at least a few reasons. Off the top of my head: sometimes I recognize the IP of someone connecting I don't want to deal with, and it's easier to go wizi in the time between their initial connection and their presence within the game than it is from the moment they actually log in. I find it useful to know whether someone is attempting to spam the MUD with continued connections, and I find it useful to have that in place when a banned IP or player attempts to connect… I'll admit there is a bit of "it's always been there" mentality, and there have been very, very few instances of it being a problem. Thus far, there's only been one instance I can think of of it being a problem from someone purposely abusing it (and I think we all know what that instance was), and maybe only about two other instances of an accidental situation that made it undesirable (the bug in the crawler, and one instance of someone with an auto-connect on their client that didn't have an auto-login script).
26 Jun, 2009, Davion wrote in the 38th comment:
Votes: 0
Oh, extremely easy! Never said it wasn't. However, I prefer my logs to be as verbose as possible. In my system, I log the IP to a new variable on the descriptor_data, as well as print out that initial sinaddr report.
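What's described above might look like this. The ip_addr field and the function name are invented for the sketch; stock Diku keeps only the host string, which the reverse lookup later overwrites.

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>

/* Sketch only: ip_addr is an invented extra field on the descriptor. */
struct descriptor_data {
    char host[256];   /* later overwritten by the gethostbyaddr lookup */
    char ip_addr[46]; /* numeric address, saved before resolution */
};

/* Call right after accept(), before any reverse lookup runs, so the
 * raw IP survives even once host holds a resolved hostname. */
void save_numeric_ip(struct descriptor_data *d, const struct sockaddr_in *sock)
{
    strncpy(d->ip_addr, inet_ntoa(sock->sin_addr), sizeof(d->ip_addr) - 1);
    d->ip_addr[sizeof(d->ip_addr) - 1] = '\0';
}
```

With the numeric address kept on the descriptor, the verbose connect-time log line becomes optional rather than the only record of the IP.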
27 Jun, 2009, Igabod wrote in the 39th comment:
Votes: 0
man oh man I don't think I've ever seen a thread with as much repetition on it. You guys are simply going over what has been said already adding very little to what has already been said… and you're repeating yourselves a lot too. So much redundancy. Oh yeah, and everything that has already been said is being said again. Repetitive redundancy is repetitious.

Does anybody else see what I'm talking about here?

I personally don't have this problem because my mud isn't listed anywhere at the moment, but I am very interested in finding a way to opt out or reduce frequency without having to implement MSSP. Hopefully you guys can add some new points to the conversation here and get it solved.

I'm not exactly skilled enough of a coder to know how this all works, but is there a way to just have a file called crawler.c or something in the mud that holds the data such as crawler_frequency? I'm thinking it might not be possible but I could be wrong so I decided to put this idea out there for the more skilled coders to debate over.
27 Jun, 2009, flumpy wrote in the 40th comment:
Votes: 0
Igabod said:
Does anybody else see what I'm talking about here?


absolutely - yawn. which is why i haven't contributed to this thread really.
Igabod said:
I personally don't have this problem because my mud isn't listed anywhere at the moment, but I am very interested in finding a way to opt out or reduce frequency without having to implement MSSP. Hopefully you guys can add some new points to the conversation here and get it solved.

I'm not exactly skilled enough of a coder to know how this all works, but is there a way to just have a file called crawler.c or something in the mud that holds the data such as crawler_frequency? I'm thinking it might not be possible but I could be wrong so I decided to put this idea out there for the more skilled coders to debate over.


I don't think this is possible. If the connection is via telnet, would anyone else be able to find that file?

Oh, and from a web developer's point of view on the whole DOS thing: unfortunately, mud applications are NOT web pages. There is a world of difference between a web server serving a single page and an application that uses telnet and multithreading to serve mud content. If I had a single web app that was being crawled every two minutes*, it would seriously degrade the performance of my application for other people. That's why we have load balancers, multiple web app instances, content caches, and apache servers that choose when to submit requests to the web apps and what to serve.

Just my tuppence, no offence or inference meant to anyone ;)

[edit - *well, maybe not every two minutes, that would be quite a long gap really. I might be more concerned about bandwidth depending on the content that required serving..]