
Team/site updates for 2008

April 4, 2008 at 12:32 am - Filed under Team, Language EN - 535 words, reading time ~1 minute

As you probably noticed it was pretty quiet here in the last 5/6 months; this happened because there were cyclic dependencies in my todo list. Well, now the situation is unblocked again and you can, perhaps, expect new posts! Just to reply to the "no more updates/why don't you post more/etc." category of remarks, I would remind you that here we publish our research and original content, and most likely we are not going to comment on/bounce/mirror everything happening in this amazing world. A direct consequence is that there will never be daily, regular or forced updates.

The most important news: an updated WordPress version, most custom hacks now self-contained and pluggable (easier maintenance/upgrading), diff/patch files for the remaining hacks (more than 2% of the whole WP codebase), a patch-maker script to ease the next upgrade (the script diffs the base version of WP used in production against the latest version and produces a patch file that is used to upgrade the custom version), a development testbed and whitelist mod_security rules (more than 1200 lines!!).
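The patch-maker workflow described above can be sketched roughly as follows. This is a minimal illustration, not the actual script: the directory names (`wp-base`, `wp-custom`, `wp-latest`) and file contents are invented to make the example self-contained.

```shell
# Simulate a pristine base WP tree and the hacked production tree
# (in the real setup these would be the actual WordPress checkouts).
mkdir -p wp-base wp-custom
echo 'the_original();' > wp-base/index.php
cp wp-base/index.php wp-custom/index.php
echo 'my_custom_hack();' >> wp-custom/index.php

# Capture the local hacks as a single unified diff.
# diff exits 1 when differences are found, so tolerate that.
diff -ruN wp-base wp-custom > custom-hacks.patch || true

# After replacing the base tree with the latest release, the same
# hacks can be re-applied on top of it, e.g.:
#   patch -p1 -d wp-latest < custom-hacks.patch
cat custom-hacks.patch
```

The point of keeping the hacks as one patch file is that upgrading becomes "apply patch, fix rejects" instead of hunting down years of in-place edits.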

The hardest part was reorganizing the custom code and the back/"future"-fixes that had been hardcoded into our WP instance over the years, while the funniest was the mod_security stuff; that alone took about 5 days.

Mod Security in a whitelist setup allows you to specify the permitted header names and formats, cookies, and GET/POST parameters and their formats. Everything else hits the fallback deny rule or is explicitly denied.
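A whitelist rule of this kind might look like the following sketch (ModSecurity 2.x syntax; the header list, formats and rule ids are invented for illustration, not taken from the actual ruleset):

```apache
# Deny any request header whose name is outside the expected set.
SecRule REQUEST_HEADERS_NAMES "!^(?i)(Host|User-Agent|Accept|Accept-Language|Accept-Encoding|Accept-Charset|Connection|Keep-Alive|Referer|Cookie|If-Modified-Since|If-None-Match|Cache-Control)$" \
    "phase:1,deny,status:403,id:900010,msg:'Unexpected request header'"

# An allowed header must also match its expected format.
SecRule REQUEST_HEADERS:Host "!^[a-z0-9.-]{1,64}(:[0-9]{1,5})?$" \
    "phase:1,deny,status:403,id:900011,msg:'Malformed Host header'"
```

Anything a rule like the first one does not recognize is rejected outright, which is exactly what makes the whitelist approach both strict and maintenance-heavy.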

The choice was to start with the configuration and header checks, outside any block. These rules are always evaluated and explicitly deny headers that don't match. A sequence of LocationMatch blocks follows, with the aim of allowing categories, articles, archives and pages while defining the permitted HTTP methods and parameters. LocationMatch was not the only option, but since almost everything here is URL-rewritten it was nearly the only way to go. Another bonus of using LocationMatch is that you can group many "virtual" pages under the same block of rules, saving you from repetition.
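One such block could be sketched as follows; the permalink regex and the parameter whitelist are hypothetical examples (standard WordPress comment-form field names), not the site's real rules:

```apache
# All article permalinks of the form /YYYY/MM/DD/slug/ share one rule set.
<LocationMatch "^/[0-9]{4}/[0-9]{2}/[0-9]{2}/[a-z0-9-]+/$">
    # Articles are viewed with GET/HEAD; POST is needed for the comment form.
    SecRule REQUEST_METHOD "!^(GET|HEAD|POST)$" "phase:1,deny,status:405"
    # Only the comment-form fields are permitted as parameters.
    SecRule ARGS_NAMES "!^(author|email|url|comment|comment_post_ID)$" "phase:2,deny,status:403"
</LocationMatch>
```

Because every rewritten article URL matches the same pattern, a single block like this covers thousands of "virtual" pages.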

Personally I learned a lot of tricks and hacks doing this work, but on the other side a whitelist guarantees a certain amount of blocked requests that are not attacks but simply carry extra or missing headers, strange formats, etc. When this happens on normal pages, the ones you expect to be viewed by a browser, my policy is "worse for you": the rules should already work with any sane major browser (Firefox, IE, Opera, Links, Amaya, etc.). But what about denied requests on RSS feeds, for example? Observation of the audit logs showed that almost every RSS reader adds nonsensical headers to the request (stuff like I-AM: feed). But why? Don't you already have the User-Agent? Regardless, I'm working to fix this.
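One possible fix, sketched here as an assumption about how the ruleset could be adjusted rather than what was actually done, is to relax the global header whitelist only for the feed URLs, removing the strict header rule (the id below is invented) inside a LocationMatch covering the feeds:

```apache
# Feed readers send odd extra headers, so drop the strict
# header-name rule only for the feed locations.
<LocationMatch "^/(feed|comments/feed)/?$">
    SecRuleRemoveById 900010
</LocationMatch>
```

This keeps the strict policy on browser-facing pages while tolerating the creative headers of feed readers.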

If a legit application of yours gets blocked or you run into problems during normal navigation, drop me a note; I'll greatly appreciate it!

This temporary block was naturally limited to the site; research continued, and the site will be updated with both published and new stuff made by us :-)

Thanks for reading,
ascii and the USH team.
