Nessus 5.0 on Backtrack 5r2

Note: I’ve not tried this on anything else so YMMV if you try this on 5r1 or less.

Fire up a terminal window and, as root, type:

apt-get remove nessus

This will remove the old v4.4.1 version from your backtrack instance and stop any nastiness occurring from the two versions clashing when you run the install.

Now head to http://www.nessus.org and grab yourself a copy of the latest version.

If you’re lucky enough to have a professional feed, great stuff.

If it’s your first time dealing with nessus, you’ll need to register for a homefeed so follow the steps online.

Download the package labeled as: Nessus-5.0.0-ubuntu910_amd64.deb (if you’ve a 64bit machine, else go for i386). I’ve not tested any other packages but I know the above one worked for me.

Now, back in the terminal window and in the location you saved the file to, type:

dpkg -i Nessus-5.0.0-ubuntu910_amd64.deb

and watch as it magically installs everything you need. Upon completion nessus should be callable from the path.

Run:

nessus-fetch --version

This confirms the version number; it should come back as 5.0.0.

Then, using the code for either your professional feed or home feed, register your nessus install:

nessus-fetch --register SERIAL_NUMBER_YOU_HAVE

Wait and it should confirm a successful registration and download the plugins.
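If this is a fresh install you'll also want a user to log in with and the nessus daemon running before the web interface is any use. A quick sketch of those steps, assuming the default /opt/nessus install location the .deb uses:

     # Create a user account for the nessus web interface (default install path assumed).
     /opt/nessus/sbin/nessus-adduser

     # Start the nessus daemon.
     /etc/init.d/nessusd start

     # The web interface should then be waiting at https://localhost:8834/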

Now just fire up nessus (first time after an update it takes an age as it unpacks and loads the plugins) and you’re away.

Not quite. What will come next is a guide as to how on earth you get Flash working in the onboard firefox on BT5R2.

Then, you should be all set for nessus scanning from your backtrack installation.

Going for the low end…

So I’ve been umming and ahhing for a while about buying a VPS to use for hosting files, playing with IP over ICMP and DNS tunnelling, or even setting up a basic PBX to hook up with the sipgate account I use for a landline.

But 3 things have stopped me from doing it in the past.

1. Security – I’m paranoid, more so now I’m in my current job, about just how easy it is to break into systems without too much hassle. Heck, I do it for my day job; do I really want to take on board the full management of a server that has no security other than what I apply to it?

2. Security – I’m paranoid, but I’d consider myself at least proficient at my job. What’s the likelihood I’ll tweak something, knacker the install, and have to rebuild it over and over just because I was trying to lock it down that bit more?

3. Cost – VPS instances aren’t cheap for something that may be left alone for months before being used on occasion.

So I never got around to it. I did mess about once, but after pooching the firewall and having to pay £15 for a server rebuild (they had no other way of accessing it and support wouldn’t just take the firewall down for me) I cancelled the account and never looked again.

Until now. I’ve been browsing the entries over at www.lowendbox.com and monitoring the offers up there. Understandably a lot of the offers are flashes in the pan: you pay $5 for access to an amazing VM, and a week later the company stops trading and starts up another scam.

However I came across a post about a company called “quickpacket” offering a very basic low end VM (128MB RAM, 256MB vSwap, 20GB HDD, 500GB transit) for $15 for 12 months.

At just over $1 a month I figured I could suck that up if they disappeared overnight, but I read up on them anyway. They’ve been trading quite a while now; they look like primarily a seller of webspace that has since moved into VPS hosting.

So yes, I am now the proud new owner of a mini-VPS. Stay tuned for developments, but on the cards I’m thinking a private svn repository, ICMP tunnelling and maybe a play about with that PBX idea.

P.S. I’ve already locked myself out of it at least 10 times. Thank goodness they have other means to access it and a great big “rebuild” button for when you’ve absolutely hosed the box, at no additional cost 🙂

NFTF: Extracting the important bits from wsusscn2.cab

Working on a script for extracting MS numbers for patches for work.

The following command allows 7zip to extract the needed files without extracting the hundreds of thousands of other items in a giant lump.

"c:\Program Files\7-Zip\7z.exe" x -ir!x/* cabs/package*.cab

Bloody useful
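For anyone wanting to do the same from a Linux box, roughly the equivalent should work with p7zip installed. An untested sketch, assuming the package*.cab files have already been pulled out of wsusscn2.cab into a cabs directory as above:

     #!/bin/sh
     # Sketch: pull only the x/* entries out of each package cab with p7zip's 7z.
     for cab in cabs/package*.cab; do
          # x = extract with full paths, -ir! = recursively include only matching names.
          7z x -ir!'x/*' "${cab}"
     done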

Tip of the day: Logical Syntax :)

Neat little way of thinking about logical vs syntax errors.

Ever had to hunt high and low for a reason why something is not working as intended? Ever had an if statement that always evaluates as true?

Yes?

So, all of you are probably aware that if the IF statement evaluates as true all the time, chances are you’ve used an assignment operator instead of a comparison (aka = instead of ==). Thing is, in a mountain of code it can be a nightmare to find, so how do you prevent it from ever being a problem?

Think backwards.

Instead of:
IF ( favChocolate == "buttons" ){ echo "He likes Cadbury buttons!" }

Use:

IF ( "buttons" == favChocolate ){ echo "He likes Cadbury buttons!" }

What does this do? Well in non-interpreted languages if you accidentally type:

If ("buttons" = favChocolate){… then it will result in a build error. You can’t assign to a string literal.

If it’s interpreted it’ll result in a runtime error as, again, you can’t assign to a string literal :)
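The chocolate examples above are pseudo-code, but you can watch the same trick bite in plain bash arithmetic. A quick sketch (the variable name is made up):

     #!/bin/bash
     favChocolate=0

     # Typo: a single = inside (( )) assigns 5, so this branch always runs.
     if (( favChocolate = 5 )); then
          echo "always true - silent bug"
     fi

     # Constant first: bash complains ("attempted assignment to non-variable")
     # instead of silently assigning, so the typo can't hide.
     if (( 5 = favChocolate )); then
          echo "never reached"
     fi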

Thus eliminating the guesswork involved in finding logical errors in your tests.

Bash “while read line” Vs Awk Large File Processing

Recently I had to fudge some data so that it would be imported into a database after an outage caused our “php” data loader to try and allocate a crazy amount of memory and die fantastically.

Being a fan of automating everything I can, I started out down the trail of “okay, let’s script this”. A few moments later I had a simple bash script looking somewhat like:

#!/bin/sh
filename=$1
while read line; do
     # Read each line and grab the necessary fields, create the insert statements.
     field1=`echo ${line} | awk '{print $1}'`
     field2=`echo ${line} | awk '{print $7}'`
     echo "INSERT INTO testtable VALUES ('${field1}',UNIX_TIMESTAMP(${field2}));" >> data.in
done < ${filename}
# Assume all is good and just feed the file to mysql for processing.
mysql -u root testdatabase < data.in
# EOF

A simplistic script, using my favourite awk statement for field delimitation (so much easier than relying on cut). Now I’m aware awk can do a lot more, however every time I pick up a book or decide to learn some I end up being told “IT’S URGENT!” and having to drop everything to “just get it working”.

It’s a common problem, I feel, when you’re really just support staff trying to keep everything hunky-dory.

This time around, while work finished at 17:30, it was now 19:00 and my script was still running, having been set off at about 10am. A quick wc -l and some dodgy division told me it still had about another 56 hours to run. I was processing 2,647,012 lines and wasn’t even past 500,000 yet.

Although I had backgrounded the process I didn’t want it to fail without me knowing. Anxious, and having already done 1.5 hours of unpaid overtime, I decided to see if there was a better way of doing the job.

Sure enough AWK reared its head again. It’s a Turing-complete programming language specifically meant for text processing, so why not have a decent look at it now that everyone else in the office has gone home and you’re stuck there until it’s done?

So after a poke around the internet for a bit of guidance in AWK, I came out with a solution of:

#!/bin/sh
filename=$1
awk 'BEGIN{
     # Special characters represented by octal escapes to prevent any escaping issues.
     q="\47"  # single quotation mark
     lb="\50" # left bracket
     rb="\51" # right bracket
     c="\54"  # comma
     sc="\73" # semi-colon
}
{
     print "INSERT INTO testtable VALUES " lb q $1 q c "UNIX_TIMESTAMP(" $7 rb rb sc >> "data.in"
}' ${filename}
mysql -u root testdatabase < data.in
#EOF

Time taken for processing: under 90 seconds.

Time taken for bash to process: Over 60 hours.

Just had to wait then for mysql to catch up and import the 2.6 million entries. Left work at 10:30pm with the solution in place.

There was a further issue with it. Every hour we collect new data, and normally the php data loader works through those few thousand lines in a few minutes without issue; however, with the backlog we had mounted up it could no longer be trusted. Unfortunately the bash solution also took so long that while it processed entries, the “queue” of new entries got longer every hour as it fell further behind. Now, with the awk solution in place, it takes less than 30 seconds to get the entries into the database.

Lesson Learned: Never use bash while loops for iterating through large text files.
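The killer isn’t the while loop itself so much as the two command substitutions inside it: every line forks a subshell and an awk process twice, which is over five million process spawns for 2.6 million lines, while the awk version reads the whole file in one process. If you want to see the gap for yourself, a rough sketch (bigfile.txt, bash_version.sh and awk_version.sh are just placeholder names for the data file and the two scripts above; comment out the mysql lines first):

     # Compare the two approaches on a 10,000 line sample of the data file.
     head -n 10000 bigfile.txt > sample.txt
     time ./bash_version.sh sample.txt
     time ./awk_version.sh sample.txt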